I have a FANUC M-430iA with an R-30iA controller and a Sony XC-56 camera.
I'm trying to set up a vision system where the robot picks up wooden blocks (each with a different letter on it) from a conveyor belt and places them in a specific order to spell a name.
To start with the basics, I'm first testing the process with the belt stopped, to make sure the camera can find the blocks and that the robot receives the correct locations and can pick the blocks up.
The program looks like this:
UTOOL_NUM=1
UFRAME_NUM=8
J P[1] 100% FINE <-- arbitrary 'home' position
CALL GRIP_OPEN
LBL[99]
VISION RUN_FIND 'LETTERS'
VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99] <-- store the found offset in vision register VR[2]
PR[5]=VR[2].OFFSET <-- copy the offset from the vision register into a position register
J P[2] 40% FINE Offset,PR[5] <-- approach position above the block (with the offset applied)
L P[3] 100mm/sec FINE Offset,PR[5] <-- grip position (with the offset applied)
CALL GRIP_CLOSE
Points 2 and 3 were touched up with the part at the vision reference position.
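In case it matters: as far as I understand, the same offset can also be applied directly on the motion lines with VOFFSET instead of copying it into a position register first. A minimal sketch of that variant (untested on my side, same labels and registers as above):

LBL[99]
VISION RUN_FIND 'LETTERS'
VISION GET_OFFSET 'LETTERS' VR[2] JMP LBL[99]
J P[2] 40% FINE VOFFSET,VR[2] <-- approach, vision offset taken directly from VR[2]
L P[3] 100mm/sec FINE VOFFSET,VR[2] <-- grip position with the same vision offset
CALL GRIP_CLOSE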
What happens is:
The letters get recognised, and every time I move the block within the FOV without rotating it, the robot picks up the part (nearly) perfectly.
However, when I rotate the part in any way, the pick position is off (in X, Y, and rotation), though still roughly in the right area of the FOV.
For the calibration I used one of the wooden blocks (with a nail in it) as a pointer and lined it up, but the nail is not 100% straight.
Many people have told me that the tool frame and TCP are very important when calibrating.
But I don't really understand the theory behind it: doesn't the vision system line the part's origin up with the TCP?
So when the part rotates, how can the origin move away? After all, non-rotated parts are picked up correctly at almost all locations.
Could this problem be caused by the nail not being completely straight?
And why would the positions only be off when the part is rotated?
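My own back-of-the-envelope reasoning (please correct me if the geometry is wrong): if the taught TCP is off by a distance d from the true gripper point, then a rotation offset of θ applied about the vision-found origin should displace the actual pick point by the chord length

error = 2 * d * sin(θ/2)

which is zero for a non-rotated part (θ = 0), and that would match my symptoms. For example, assuming the bent nail put the TCP off by d = 2 mm:

θ = 45° -> error ≈ 2 * 2 mm * sin(22.5°) ≈ 1.5 mm
θ = 90° -> error ≈ 2 * 2 mm * sin(45°) ≈ 2.8 mm
θ = 180° -> error = 2 * 2 mm = 4 mm

Does that explanation hold up?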
Thanks in advance!