I have a FANUC M-430iA with an R-30iA controller and a Sony XC-56 camera.
I'm trying to set up a vision system where the robot picks up wooden blocks (each with a different letter on it) from a conveyor belt and places them in a specific order to spell a name.
To test the basics, I thought I'd run the process without the belt moving, to make sure the camera can find the blocks and that the robot gets the correct locations and can pick them up.
The program looks like this:
J P 100% FINE                          <-- arbitrary 'home' position
VISION RUN_FIND 'LETTERS'
VISION GET_OFFSET 'LETTERS' VR JMP LBL <-- store the found offset in a vision register
PR=VR.OFFSET                           <-- copy the offset from the vision register into a position register
J P 40% FINE Offset,PR                 <-- approach position above the block (with the offset applied)
L P 100mm/sec FINE Offset,PR           <-- grip position (with the offset applied)
Points 2 and 3 (the approach and grip positions) are touched up at the vision reference position.
What happens is:
The letters get recognised, and every time I move a block within the FOV without rotating it, the robot picks it up (nearly) perfectly.
However, when I rotate the part in any way, the pick position is off (in X, Y, and rotation), even though the part is still well within the FOV.
For calibration I lined up one of the wooden blocks (with a nail in it) as the pointer, but the nail is not 100% straight.
Many people have told me that the tool frame and TCP are very important when calibrating.
But I don't really get the science behind it: doesn't the origin of the part get lined up with the TCP when using the vision system?
So when the part rotates, how can that origin move away? After all, non-rotated parts are picked up correctly at almost every location.
Could this problem be caused by the nail not being completely straight?
And why would the positions be off only when the part is rotated?
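For my own understanding I tried to sketch the geometry (this is just my illustration in Python, not FANUC code, and the offset vector `d` is a made-up example): if the taught TCP is off from the true grip point by some small vector d, then rotating the part by theta makes the pick error (R(theta) - I) * d, which is exactly zero when theta = 0. That would match what I'm seeing, where unrotated parts pick fine everywhere.

```python
import math

def pick_error(d, theta_deg):
    """XY error caused by a TCP offset d (mm) when the part is rotated.

    The vision offset rotates the taught pick pose about the (wrong) TCP,
    so the error is (R(theta) - I) applied to the TCP offset vector d.
    """
    t = math.radians(theta_deg)
    ex = math.cos(t) * d[0] - math.sin(t) * d[1] - d[0]
    ey = math.sin(t) * d[0] + math.cos(t) * d[1] - d[1]
    return (ex, ey)

d = (2.0, 1.0)            # hypothetical 2 mm / 1 mm TCP error (bent nail)
print(pick_error(d, 0))   # (0.0, 0.0): no rotation, no visible error
print(pick_error(d, 90))  # error of the same order as the TCP offset itself
```

So if this reasoning is right, even a small bend in the calibration nail would only show up once the part rotates.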
Thanks in advance!