Robotforum - Support and discussion community for industrial robots and cobots › Forum › Industrial Robot Support and Discussion Center › Fanuc Robot Forum

Robot guidance using a ControlLogix PLC and a Cognex camera

  • IRockWell · May 8, 2023 at 1:46 AM · Thread is Unresolved
  • IRockWell · May 8, 2023 at 1:46 AM · #1

    I am trying to guide an R-2000iC robot on an R-30iB (non-Plus) controller with a Cognex 8405 camera.

    I calibrated the camera and I am using an offset in a user frame that is at the same Z height as the camera calibration. Works great for X and Y. Rotation was a bit of a hiccup until I realized that the origin of the camera pattern needs to be at the same X and Y coordinates as the TCP. Now rotation works fine as well.

    What I am worried about is maintenance (or whoever) re-teaching the pick location (or the vision program). We have other Cognex-to-Fanuc guidance applications that were created by an external contractor using Karel. It seems he is manipulating the user frame instead of using an offset. Is that a common practice? Is there a best practice, ideally using .TP? I am not a big fan of Karel, as the .KL source will get lost, if it is even provided.

  • HawkME · May 8, 2023 at 2:14 PM · #2

    It is the same issue as using Found Position vs. VOFFSET in a Fanuc iRVision application. It won't do the math for you to make things correct. You have found the exception where things still work out if you are centered.
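    The "centered" exception can be illustrated with a quick sketch (plain Python, made-up numbers): a proper frame offset rotates the taught point about the vision frame origin before translating, while naively adding X/Y ignores the rotation pivot. The two only agree when the taught point sits at the pattern origin.

    ```python
    import math

    def frame_offset(px, py, dx, dy, r_deg):
        """Proper offset: rotate the taught point (px, py) about the frame
        origin by R, then translate by the vision-found (dx, dy)."""
        r = math.radians(r_deg)
        return (px * math.cos(r) - py * math.sin(r) + dx,
                px * math.sin(r) + py * math.cos(r) + dy)

    def naive_offset(px, py, dx, dy, r_deg):
        """Naive offset: add X/Y and ignore where the rotation pivots."""
        return (px + dx, py + dy)

    # Taught point at the pattern origin: both methods land at (10, 5).
    print(frame_offset(0, 0, 10, 5, 30))
    print(naive_offset(0, 0, 10, 5, 30))

    # Taught point 50 mm from the origin: the naive result is off,
    # because the 30-degree rotation swings the point about the origin.
    print(frame_offset(50, 0, 10, 5, 30))
    print(naive_offset(50, 0, 10, 5, 30))
    ```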

    You have 3 choices to make a general solution.

    1. Make sure your vision surface is coplanar to the robot's world frame. Output your vision position in world coordinates, then set a user frame equal to that output.

    2. Use matrix multiplication to multiply your vision user frame by the found position.

    3. Do an air move to the offset position, then set a UF = LPOS

    #2 is the best solution, but it either requires Karel or you can do it in TP with the Vision Support Tools option.

    #1 only works if you mechanically level your fixture so the W and P angles are level to the robot world frame. 2D vision only gives you X, Y, and R, so you must physically account for Z, W, and P. Z is simply entering the correct value, but W and P must be level.

    #3 is the poor man's way out. It works perfectly fine but looks stupid and wastes cycle time.

  • IRockWell · May 9, 2023 at 12:45 AM · #3

    Thanks a lot! Can't wait to try those out!

    If I understand #3 correctly, it would move, let's say, 100 mm above the pick location, then set the UFrame, rotate, then move down 100 mm? I think that would look awesome, and cycle time should not be an issue.

    Could you elaborate on #2? I did linear algebra in college, but it's been a while and it was my only D.

    Am I good if the user frame has the same X and Y directions and Z height as the camera calibration, i.e., does the origin matter? What do I do with the result of the matrix multiplication?

    Also, do you have a "buy me a beer" PayPal or something? You have helped me so much over the last couple of years, either with answers to my questions or even more with answers to other people's questions, that it makes me feel bad.

  • HawkME · May 9, 2023 at 3:03 AM · #4

    #3, I was exaggerating a bit when I said it wouldn't look good. You would need to do a Fine move with the vision X, Y, and R offset first, above your part. Basically, move to the raw found position but with added Z, while in your calibration UF. Then switch your user frame to 0, or one that is equivalent to world. Then take LPOS. Then set a new UF = LPOS. Switch to that new UF, and any positions thereafter, you or maintenance are free to touch up as needed.

    #2: you set a new UF equal to the UF that was used for your vision calibration multiplied by your vision found position: UfNew = UfVis × FoundPos. Then switch to the new UF and teach all pick positions from there.

    All 3 solutions are doing the same thing, just in different ways. All UFs are defined relative to world, so you just need a way to convert the vision found position to world, and that defines your new frame.
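    That frame multiplication can be sketched in plain Python with 4×4 homogeneous transforms. The rotation order below (Rz·Ry·Rx for R, P, W) follows the common Fanuc XYZWPR convention, but verify it against your controller's documentation; the numbers are made up for illustration.

    ```python
    import math

    def xyzwpr_to_matrix(x, y, z, w, p, r):
        """Convert an XYZWPR pose (mm, degrees) to a 4x4 homogeneous matrix,
        applying the rotations as Rz(r) @ Ry(p) @ Rx(w)."""
        w, p, r = map(math.radians, (w, p, r))
        cw, sw = math.cos(w), math.sin(w)
        cp, sp = math.cos(p), math.sin(p)
        cr, sr = math.cos(r), math.sin(r)
        return [
            [cr * cp, cr * sp * sw - sr * cw, cr * sp * cw + sr * sw, x],
            [sr * cp, sr * sp * sw + cr * cw, sr * sp * cw - cr * sw, y],
            [-sp,     cp * sw,                cp * cw,                z],
            [0.0,     0.0,                    0.0,                    1.0],
        ]

    def matmul(a, b):
        """Multiply two 4x4 matrices (UfNew = UfVis x FoundPos)."""
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    def matrix_to_xyzwpr(m):
        """Extract XYZWPR back out of a 4x4 homogeneous matrix."""
        p = math.atan2(-m[2][0], math.hypot(m[0][0], m[1][0]))
        r = math.atan2(m[1][0], m[0][0])
        w = math.atan2(m[2][1], m[2][2])
        return (m[0][3], m[1][3], m[2][3],
                math.degrees(w), math.degrees(p), math.degrees(r))

    # Hypothetical numbers: vision calibration UF and a camera found position.
    uf_vis = xyzwpr_to_matrix(500.0, 200.0, 300.0, 0.0, 0.0, 0.0)
    found = xyzwpr_to_matrix(10.0, 5.0, 0.0, 0.0, 0.0, 15.0)
    uf_new = matmul(uf_vis, found)
    print(matrix_to_xyzwpr(uf_new))  # ≈ (510.0, 205.0, 300.0, 0.0, 0.0, 15.0)
    ```

    The resulting XYZWPR values are what you would write into the new user frame before teaching pick positions in it.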

    I haven't setup a PayPal for that but now that you mention it I could use a beer. :beerchug:

  • Nation (Typical Robot Error) · May 9, 2023 at 7:37 PM · #5

    For #3, you could even manipulate the $MCR_GRP[1].$MACHINELOCK system var to make the air move invisible to the user. It would still need some cycle time though.

