Robot guidance using a ControlLogix PLC and a Cognex camera

  • I am trying to guide an R-2000iC robot on an R-30iB (non-Plus) controller with a Cognex 8405 camera.


    I calibrated the camera, and I am using an offset in a user frame that is at the same Z height as the camera calibration. It works great for X and Y. Rotation was a bit of a hiccup until I realized that the origin of the camera pattern needs to be at the same X and Y coordinates as the TCP. Now rotation works fine as well.
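
    In case it helps, here is a minimal TP sketch of what I mean by the offset approach, assuming the PLC writes the Cognex X, Y, and R results into R[1..3] and PR[1] is a Cartesian position register used as the offset (all register, frame, and position numbers are placeholders):

        !PLC writes Cognex X,Y,R to R[1..3] ;
        PR[1,1]=R[1] ;
        PR[1,2]=R[2] ;
        PR[1,3]=0 ;
        PR[1,4]=0 ;
        PR[1,5]=0 ;
        PR[1,6]=R[3] ;
        !UF[1] = vision calibration frame ;
        UFRAME_NUM=1 ;
        !P[1] = nominal pick taught in UF[1] ;
        L P[1] 500mm/sec FINE Offset,PR[1] ;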


    What I am worried about is how maintenance (or whoever) will re-teach the pick location (or the vision program) down the road. We have other Cognex-to-Fanuc guidance applications that were created by an external contractor using Karel, and it seems like he is manipulating the user frame instead of using an offset. Is that common practice? Is there a best practice, ideally using .TP? I am not a big fan of Karel, as the .KL source tends to get lost, if it is even provided.

  • It is the same issue as using Found Position vs. Voffset in a Fanuc iRVision application: it won't do the math for you to make things correct. You have found the exception where things still work out if you are centered.


    You have three choices for a general solution.


    1. Make sure your vision surface is coplanar with the robot's world frame. Output your vision position in world coordinates, then set a user frame equal to that output.


    2. Use matrix multiplication to multiply your vision user frame by the found position.


    3. Do an air move to the offset position, then set a UF = LPOS


    #2 is the best solution, but it requires either Karel or the Vision Support Tools option to do it in TP.


    #1 only works if you mechanically level your fixture so the W and P angles are level with the robot's world frame. 2D vision only gives you X, Y, and R, so you must physically account for Z, W, and P. Z is simply a matter of entering the correct value, but W and P must be level.
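
    A rough TP sketch of #1, assuming your Cognex job outputs X, Y, and R already in robot world coordinates and the PLC writes them to R[1..3], with R[4] holding the fixture Z you enter by hand and PR[10] as a Cartesian position register (all register and frame numbers are placeholders):

        !R[1]=X, R[2]=Y, R[3]=R in world, R[4]=fixture Z ;
        PR[10,1]=R[1] ;
        PR[10,2]=R[2] ;
        PR[10,3]=R[4] ;
        PR[10,4]=0 ;
        PR[10,5]=0 ;
        PR[10,6]=R[3] ;
        !W,P=0: fixture leveled to world ;
        UFRAME[2]=PR[10] ;
        UFRAME_NUM=2 ;
        !Teach all pick positions in UF[2] ;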


    #3 is the poor man's way out. It works perfectly fine but looks stupid and wastes cycle time.

  • Thanks a lot! Can't wait to try those out!


    If I understand #3 correctly, it would move to, let's say, 100 mm above the pick location, set the uframe, rotate, then move down 100 mm? I think that would look awesome, and cycle time should not be an issue.


    Could you elaborate on #2? I did linear algebra in college, but it's been a while, and it was my only D.

    Am I good if the user frame has the same X and Y directions and Z height as the camera calibration, i.e. does the origin matter? What do I do with the result of the matrix multiplication?


    Also, do you have a "buy me a beer" PayPal or something? You have helped me so much over the last couple of years, either with answers to my questions or, even more, with answers to other people's questions, that it makes me feel bad.

  • For #3, I was exaggerating a bit when I said it wouldn't look good. You would need to do a FINE move with the vision X, Y, and R offset first, above your part: basically move to the raw found position but with added Z, while in your calibration UF. Then switch your user frame to 0, or one that is equivalent to world, take LPOS, and set a new UF = LPOS. Switch to that new UF, and any positions thereafter you or maintenance are free to touch up as needed.
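
    In TP that could look something like the sketch below (PR, frame, and position numbers are placeholders; PR[11] is assumed to hold the Cognex X/Y/R offset, and P[1] is the raw found position taught with the extra Z):

        !UF[1] = vision calibration frame ;
        UFRAME_NUM=1 ;
        UTOOL_NUM=1 ;
        !P[1] = raw found position taught with added Z ;
        L P[1] 500mm/sec FINE Offset,PR[11] ;
        !Read the reached position while in UF 0 (= world) ;
        UFRAME_NUM=0 ;
        PR[12]=LPOS ;
        !New frame = where the TCP ended up ;
        UFRAME[2]=PR[12] ;
        UFRAME_NUM=2 ;
        !Everything below is taught in UF[2], free to touch up ;
        L P[2] 500mm/sec CNT100 ;
        L P[3] 250mm/sec FINE ;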


    For #2, you set a new UF equal to the UF that was used for your vision calibration multiplied by your vision found position: UFnew = UFvis x FoundPos. Then switch to the new UF and teach all pick positions from there.
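
    Spelling that multiplication out as transforms (same equation, just with the reference frames written in):

        UFnew (w.r.t. world) = UFvis (w.r.t. world) x FoundPos (w.r.t. UFvis, with Z, W, P at zero)

    Chaining the two expresses the vision result in world coordinates, which is exactly what a user frame entry has to be, so you write the product into a spare frame number and everything taught in that frame follows the part.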


    All three solutions are doing the same thing, just in different ways. All UFs are defined relative to world, so you just need a way to convert the vision found position to world, and that defines your new frame.


    I haven't set up a PayPal for that, but now that you mention it, I could use a beer. :beerchug:

  • For #3, you could even manipulate the $MCR_GRP[1].$MACHINELOCK system variable to make the air move invisible to the user. It would still cost some cycle time, though.
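
    An untested sketch of what that could look like in TP, assuming $MACHINELOCK is writable from a running program on your controller and that LPOS still tracks the commanded position while the group is locked (PR and frame numbers are the same placeholders as in the sketch above):

        !Lock group 1 so the air move is virtual ;
        $MCR_GRP[1].$MACHINELOCK=1 ;
        UFRAME_NUM=1 ;
        L P[1] 2000mm/sec FINE Offset,PR[11] ;
        UFRAME_NUM=0 ;
        PR[12]=LPOS ;
        UFRAME[2]=PR[12] ;
        !Unlock before any real motion ;
        $MCR_GRP[1].$MACHINELOCK=0 ;
        UFRAME_NUM=2 ;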

