iRVision precision

  • I have a FANUC M-430iA with an R-30iA controller and a Sony XC-56 camera.
    I'm trying to set up a vision system where the robot picks wooden blocks (each with a different letter on it) from a conveyor belt and places them in a specific order to form a name.
    To test some basics, I thought I'd try the process with the belt stopped, to make sure the camera can find the blocks and that the robot gets the correct locations and can pick them up.


    The program looks like this:


    UTOOL_NUM=1
    UFRAME_NUM=8
    J P[1] 100% FINE <-- random point 'home position'
    CALL GRIP_OPEN
    LBL[99]
    VISION RUN_FIND 'LETTERS'
    VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99] <-- store offset in vision register
    PR[5] = VR[2].OFFSET <-- move offset from vision register to position register
    J P[2] 40% FINE Offset,PR[5] <-- position above the block (with the position register offset)
    L P[3] 100mm/sec FINE Offset,PR[5] <-- grip position (with the position register offset)
    CALL GRIP_CLOSE


    Points 2 and 3 are touched up at the reference position.


    What happens is:
    The letters get recognised, and every time I move the block within the FOV without rotating it, the robot picks up the part (nearly) perfectly.
    However, when I rotate the part in any way, the position is off (in X, Y, and rotation), even though the part is still within the FOV.


    I used one of the wooden blocks (with a nail in it) as a pointer and lined it up for calibration, but the nail is not 100% straight.


    Many people have told me that the tool frame and TCP are very important when calibrating.


    But I don't really get the science behind it: doesn't the origin of the part get lined up with the TCP when using the vision system?
    So when the part rotates, how can the origin move away? After all, non-rotated parts are picked up correctly at almost all locations.


    Can this problem be caused by the nail not being completely straight?
    And why are the positions only off when the part is rotated?


    Thanks in advance!

  • First, make sure your gripper TCP is accurate.


    Then, you should not store the vision offset into a position register and use it as a normal offset. Instead, just use VOFFSET on your points directly.
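
    For example, a minimal sketch using your process name and points (check the exact motion-option syntax on your controller; VOFFSET applies the vision register to the motion directly):


    VISION RUN_FIND 'LETTERS'
    VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99]
    J P[2] 40% FINE VOFFSET,VR[2] <-- approach point, vision offset applied directly
    L P[3] 100mm/sec FINE VOFFSET,VR[2] <-- grip point, same vision offset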


    If that doesn't solve your issue then I would say it's probable that your calibration was not done correctly.



    **Edit**


    I came back to edit the post to answer your question about the "science" of what is going on. A very common problem I see is a robotic vision system that is accurate in X, Y, Z but not in rotation, or vice versa. Different things are at play that can cause only a translational or only a rotational error.


    Here is an analogy that may help you understand: Say that I place your hand on a block and tell you that this is the starting point. Then I can tell you to move left, right, forward, or backwards from that place to get to another block. All good so far. Then say I tell you to rotate to get the block, but instead of rotating your hand you rotate hinging from your elbow. Now you completely miss the block because you rotated about the wrong point. The robot is the same way. If you don't do everything correctly it will rotate around the wrong point. But if you don't have to rotate you can still be accurate because you are just moving left, right, forward, backward an offset amount from the starting point. (A broken clock is still right twice a day!)
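
    To put a rough number on that analogy (my own back-of-envelope figure, not anything from the manuals): if the programmed TCP is off from the true grip center by a distance r, then commanding a rotation offset of angle θ shifts the pick point by about

    |e| = 2 · r · sin(θ/2)

    so a 3 mm TCP error combined with a 90° part rotation already misses by roughly 4.2 mm, while an unrotated part (θ = 0) is still picked perfectly.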


    What can cause rotational errors in a vision pick application? The most likely issues are a bad TCP, a bad camera calibration, or incorrect programming of the offset. There are other issues that will affect your accuracy, such as the user frame and mastering, but I would start with what I listed above, then let us know what you find.


  • First of all, thanks for your extensive explanation. It's starting to make sense now.


    I tried using VOFFSET; however, when I did, the robot went completely out of place (far outside the FOV).
    And the gripper is set up correctly.


    So it must be the calibration then, because when I jog the (self-made) pointer tool in TOOL coordinates, the pointer is off by a couple of millimeters, especially when rotating around the Z-axis.
    Maybe I should advise the owner to buy a professional pointer.


    For now I just snapped the square block at four angles, 90 degrees apart.
    This seems to work out pretty well, but it adds complexity when making a program (each orientation requires a different model ID), roughly like the sketch below.
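
    The branching ends up looking something like this (a simplified sketch; the register and label numbers are placeholders, and VR[2].MODELID holds the ID of the matched vision model):


    R[2]=VR[2].MODELID
    IF R[2]=1,JMP LBL[10] <-- model taught at 0 deg
    IF R[2]=2,JMP LBL[20] <-- model taught at 90 deg
    IF R[2]=3,JMP LBL[30] <-- model taught at 180 deg
    IF R[2]=4,JMP LBL[40] <-- model taught at 270 deg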

  • I've been trying the process with line tracking now (conveyor on).


    Everything is working well, but when I change the conveyor speed, this happens when picking up one of the blocks:


    [Pictures attached of the pick result when stationary and at conveyor speeds 2, 4, and 6; the block sits further from the gripper center as the speed increases.]


    Can this be caused by the exposure time (maybe too high)?
    Or is this lag that can't be fixed? Because if that's the case, I will need to run the conveyor at the same speed every time and adjust the offset.


    Thanks in advance!

  • When you switch offset methods you have to reteach all the points that are used with the offset. It looks like just P[2] in the code you posted.


    The other thing is to make sure you have the correct UT and UF selected. You have User Tool 1 selected; does that correspond to the center point of your gripper?


  • First I made a new reference, ran the RUN_FIND & GET_OFFSET, and then I retaught the point with the touch-up method.


    The user tool I used is UTOOL 1, which was already set up for the gripper before I started working on it. I don't think they set it up wrong, but is there a way to test it? It's a gripper, so I can't really jog it around a single point.
    And the user frame I used is UFRAME 8. I created this user frame when calibrating the camera (without moving the calibration grid).


  • If the application frame is set to 0, check to make sure the XY of the calibration grid matches the world frame of the robot.


    I'm using the grid calibration user frame for this project.

  • Dutchbrother
    I merged your two posts.


    Regarding your issue:
    I work with iRPickTool, which integrates vision and line tracking. Anyway, in your comments I never see you mention how you integrate the pulse counts with the vision. I think that's the problem: somewhere between the tracking and the vision you are missing the link.



  • Yeah, sorry, the reason I made a new post is that it's regarding a different problem.
    The 2nd problem (the blocks ending up off the gripper center) occurred when I adjusted the program for tracking:


    UTOOL_NUM=1
    UFRAME_NUM=8
    LINE[1]=ON
    J P[1] 100% FINE <-- random point 'home position'
    CALL GRIP_OPEN
    LBL[99]
    VISION RUN_FIND 'LETTERS'
    VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99]
    PR[5] = VR[2].OFFSET
    LINECOUNT[1]=R[1]
    SETTRIG LNSCH[1] R[1]
    SELBOUND LNSCH[1] BOUND[1]
    CALL TRACKING


    And I made a separate program, 'TRACKING', that includes only the tracking part.


    J P[2] 40% FINE Offset,PR[5] <-- position above the block (with the position register offset)
    L P[3] 100mm/sec FINE Offset,PR[5] <-- grip position (with the position register offset)
    CALL GRIP_CLOSE


    Within the program details of this 2nd program I can enter a line tracking schedule, which includes the encoder count.
    I can see that it takes that into account, because it always grabs the part; it just grabs it further from the center when I speed up the conveyor.
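
    My current guess: if there is a fixed time delay between snapping the image and latching the trigger count, the error would be roughly

    error ≈ belt speed × delay

    which would explain why the offset from center grows in step with the conveyor speed in the pictures above.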

  • You can check your gripper TCP. Here is a simple method.


    With the correct UT number selected and in USER jog coordinates, carefully jog the gripper down to a sheet of paper and trace the outside of it with a pen or marker. Then jog straight up, rotate 180 degrees in user coordinates, jog back down to the paper, and see if it lines up with your tracing. If the gripper is symmetrical (which it looks to be in your picture) and the TCP is correct in X and Y, then it should line up perfectly. If you don't want to use paper, you could instead use blocks to mark the location. If the TCP does not line up when rotated, then you will not be able to pick rotated parts accurately.
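
    One note on sensitivity (my own arithmetic, not part of the method itself): a TCP that is off by r in X/Y will put the rotated tracing 2 · r away from the original, which is the θ = 180° case of the formula in my earlier post, so even a 1 mm TCP error shows up as a clearly visible 2 mm gap between the two outlines.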

  • Good tip, I will try that!
    Does it have to be in USER coord? Or can I do it in JGFRM as well and then rotate joint 5 (the last joint)?
    I don't really know what USER coord is for.

  • Make sure the correct user tool and user frame are selected. I would do it in USER coord to make it easier to square up with your table/sheet of paper. Do not rotate in JOINT coord, because joint mode is independent of your TCP. Looking at your picture, it will probably be jogging rotation about Z (the Z-rotation or J6 button on the TP).


  • Thanks! The gripper seems to be set up correctly.

  • And the 'lag' problem has been solved.
    The solution was to put the LINECOUNT and SETTRIG lines between the RUN_FIND and the GET_OFFSET:


    VISION RUN_FIND 'LETTERS' <-- snap the image
    LINECOUNT[1]=R[1] <-- capture the line count for schedule 1 right at the snap
    SETTRIG LNSCH[1] R[1] <-- trigger tracking schedule 1 from that count
    VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99]
    PR[5] = VR[2].OFFSET
    SELBOUND LNSCH[1] BOUND[1]


    I can now run it at a speed of 9.
