
Applying Vision Offsets with a KUKA Robot

  • TheCarebear
  • January 15, 2025 at 9:24 PM
  • Thread is Resolved
  • TheCarebear
    • January 15, 2025 at 9:24 PM
    • #1

    Hey Team,

    I'm extremely new to KUKA robots and we are tasked with applying 60 different offsets (X, Y, Z) in the same program. We will get our vision offsets from a Keyence camera, and the PLC will tell us what the values are. We will have the same 5-point PTP path that dispenses 60 times, but most likely with 60 different offsets, for 60 different starting points.


    Edit: Base frame 1. The robot and vision base frames should be the same.

    Update: All of the offsets are going to be offsets from X, Y, Z (0, 0, 0).


    My question is: how do we do this? What is the best method?


    Can anyone explain to me how we do this? #Offsets

    Edited 2 times, last by TheCarebear: New information to provide. (January 15, 2025 at 10:33 PM).

  • Jack Bauer
    • January 15, 2025 at 9:48 PM
    • Best Answer
    • #2

    Hello, I'm a KUKA expert freelancer; I hope we can cooperate with your company long-term.

    Code

    ; Define the offsets for each of the 60 positions
    DECL E6POS base_position[60]
    DECL E6POS offsets[60] ; filled with the values from the PLC
    DECL INT i

    ; Retrieve offsets for each point (these values come from the PLC)
    FOR i = 1 TO 60
      base_position[i] = P1
      base_position[i].X = P1.X + offsets[i].X
      base_position[i].Y = P1.Y + offsets[i].Y
      base_position[i].Z = P1.Z + offsets[i].Z
      ; Repeat for P2, P3, P4, P5, etc.
    ENDFOR
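
    A note on getting those 60 offset sets into the offsets[] array above: this is only a sketch, assuming the PLC writes each value as an integer input group with a scaling of 0.01 mm per count. The signal names, input ranges, and scaling are placeholders, not anything from the actual Keyence or PLC setup in this thread.

    Code

    ; --- in $config.dat: map the PLC values to input groups (placeholder ranges) ---
    SIGNAL PLC_PartIdx $IN[100] TO $IN[107] ; which of the 60 parts (1..60)
    SIGNAL PLC_OffsX $IN[108] TO $IN[123]   ; X offset as an integer, 0.01 mm per count (assumed)
    SIGNAL PLC_OffsY $IN[124] TO $IN[139]
    SIGNAL PLC_OffsZ $IN[140] TO $IN[155]

    ; --- in the SRC, once the PLC flags the values as valid ---
    DECL INT idx
    idx = PLC_PartIdx
    offsets[idx].X = PLC_OffsX * 0.01 ; scale counts back to mm
    offsets[idx].Y = PLC_OffsY * 0.01
    offsets[idx].Z = PLC_OffsZ * 0.01

    Negative offsets also need a sign convention (for example an added constant or a separate sign bit), which has to be agreed with whoever programs the PLC.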

  • TheCarebear January 15, 2025 at 10:35 PM

    Selected a post as the best answer.
  • SkyeFire
    • January 16, 2025 at 3:47 PM
    • #3

    The first questions are:

    Is this camera fixed, or mounted to the robot?

    Are these offsets in a fixed reference frame (Base), or a Tool offset?

    What Base is your path taught in?

    We need to see what your path program looks like, and your I/O setup between the robot and Keyence.

    Typically, you teach the path and the vision to a part that becomes the 0 reference -- that is, the vision system produces offsets of 0 for a part exactly there, and the path (dispense in your case?) is taught to that part.

    Then, ideally, when the part moves, the vision system measures the difference between the new position and the 0-reference position, and sends offsets to the robot. The robot applies those offsets, and runs the same path.

    If we assume a fixed camera, and that the vision system and robot have already been calibrated to each other (sometimes called hand-eye calibration), then you're probably using Base offsets. Something like:

    Code
    DECL FRAME _fShift ; create temporary Frame for offset
    _fShift = $NULLFRAME ; init to all 0s
    _fShift.X = KeyenceX ; get the X offset from the vision
    _fShift.Y = KeyenceY ; get the Y offset from the vision
    _fShift.Z = KeyenceZ ; get the Z offset from the vision
    
    BASE_DATA[2] = BASE_DATA[1] ; assuming the "nominal" frame is Base 1
    BASE_DATA[2] = BASE_DATA[2] : _fShift ; apply the shifts to Base 2
    $BASE = BASE_DATA[2] ; activate Base 2
    PTP P1
    LIN P2 C_DIS
    LIN P3 C_DIS
    LIN P4 C_DIS

    Applying offsets to Frames in KRL is done using the Geometric Operator, which has a lot of discussions in the forum archives. But this is the basic idea. If you ran this code and KeyenceX was +10, the path would be run 10mm from where it was originally taught, on the X+ axis of Base 1.

    You don't want to change your nominal Base, which is why in this example I copy it to Base 2, then shift Base 2, then run the program with Base 2 active. That (hopefully) prevents Base 1 from being corrupted by a programming error.
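
    A sketch (untested) of one way the Base 2 shift above might be wrapped in a loop for the 60 dispense cycles the OP describes. DispenseAll, PartReady, and CycleDone are made-up names for the program and the PLC handshake signals, and KeyenceX/Y/Z are the same placeholders as in the snippet above; substitute whatever your actual I/O mapping provides.

    Code

    DEF DispenseAll()
      DECL FRAME fShift
      DECL INT i

      FOR i = 1 TO 60
        WAIT FOR PartReady ; hypothetical PLC input: offsets for part i are valid
        fShift = $NULLFRAME
        fShift.X = KeyenceX
        fShift.Y = KeyenceY
        fShift.Z = KeyenceZ
        BASE_DATA[2] = BASE_DATA[1] : fShift ; shift the nominal Base 1 into Base 2
        $BASE = BASE_DATA[2]
        $TOOL = TOOL_DATA[1] ; whichever tool the path was taught with
        PTP P1 ; the same taught 5-point path every cycle
        LIN P2 C_DIS
        LIN P3 C_DIS
        LIN P4 C_DIS
        LIN P5 C_DIS
        PULSE(CycleDone, TRUE, 0.5) ; hypothetical output telling the PLC this part is done
      ENDFOR
    END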

  • TheCarebear
    • January 16, 2025 at 4:30 PM
    • #4
    Quote from SkyeFire

    Is this camera fixed, or mounted to the robot?

    It is NOT mounted to the robot; it's on a gantry.

    Quote from SkyeFire

    Are these offsets in a fixed reference frame (Base), or a Tool offset?

    Fixed reference base. Should be 0,0,0 for X,Y,Z.

    Quote from SkyeFire

    What Base is your path taught in?

    Probably Base 0 or 1; we are matching the camera base to the robot base, so they will be the same.


    Quote from SkyeFire

    We need to see what your path program looks like, and your I/O setup between the robot and Keyence.

    We have not established the I/O setup yet; the robot is going to talk to the PLC, and we will get that info from the PLC.

    Thank you so much for the tips. I will look deeper into the Geometric Operator.

  • SkyeFire
    • January 16, 2025 at 10:12 PM
    • #5
    Quote from TheCarebear

    Probably Base 0 or 1; we are matching the camera base to the robot base, so they will be the same.

    Mmm... Base 0 doesn't always work. Depends on what vision system you're using. It also means that your offsets are being calculated from a reference point fairly far away from the actual working position, which means small rotations (like minor noise or rounding errors in the offset measurement) incur larger motion errors. Ideally, the origin of the reference frame should be as close to the work as is practical.

    Typical approach is to put the calibration grid on the work table (so, directly under the camera), then perform the robot and camera calibration to that grid without moving it. That generally ties the reference frames of both robot and camera together nicely. This often puts the origin of the shared reference frame in the middle of the work area, so any rounding errors tend to get averaged out evenly.

    Also, I noticed you said XYZ offsets. Are you using 2D or 3D? Typically from a 2D overhead camera you get X, Y, and Rz -- Z offsets from a 2D camera are relatively rare, though not unheard of, and generally rely on measuring the apparent "size" of the target to get a rough value for how much closer or further it is from the camera than the 0-reference part.
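
    If the camera does return X, Y, and Rz, the same geometric-operator approach covers the rotation as well: in a KRL FRAME the A component is the rotation about Z. A small sketch, with KeyenceRz as a placeholder just like KeyenceX/Y above:

    Code

    DECL FRAME fShift
    fShift = $NULLFRAME
    fShift.X = KeyenceX
    fShift.Y = KeyenceY
    fShift.A = KeyenceRz ; rotation about the Base Z axis, in degrees
    BASE_DATA[2] = BASE_DATA[1] : fShift
    $BASE = BASE_DATA[2]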

  • mil3k
    • January 29, 2025 at 10:48 PM
    • #6

    Why do you want to use a PLC in the middle between the Keyence (which one?) and the robot? Ask your Keyence support for all the manuals for your vision system and read them. Keyence's CV-X and XG-X vision systems can both control robots as slaves, as long as you run the provided solution.

    I hope your camera is not the IV series, as those are not vision systems but vision sensors and cannot give you coordinates.
