So I've been having some issues with calibrating my robots, and I have to do a lot of touch-ups to my Roboguide-created programs. My method of calibrating is to use the 3-step method on the calibration tab, then apply the offset to the actual robot, since I have multiple parts/fixtures that get swapped in and out. I get pretty close, and sometimes I run this method a few times to get it as close as I can. Then I'll redo my UFrame in Roboguide after the calibration offset settings have been applied to the robot. I'm thinking this is where I might be missing something.
The real robots already have the UFrame set up, so I have used the same UFrame number in Roboguide as on the real robot (UFRAME_NUM = 1), and this has my program points way off (pic 1). I have also copied the UFrame settings from the real robot into my Roboguide UFrame settings (pic 2); this gets my calibration closer, but still not accurate. What methods are you guys using, and what am I missing in my approach, or doing wrong?
Roboguide virtual and real robot calibration
-
ORDEP81 -
August 13, 2018 at 4:14 PM -
Thread is marked as Resolved.
-
-
Anyone?
-
You actually need to calibrate the position of your cell elements with respect to the robot:
You teach three points in the virtual cell to locate the part or object.
Then you transfer the calibration program to the real robot and touch up the positions.
Save the program, load it back into the same directory, then run the calibration tool from the modified program.
Roboguide will make a best fit, and the object/part will be relocated to reflect the real position.
For this to work, the USER frame should be the same in both the real and virtual controllers.
The CAD model of the part should also be accurate: if the real part and its CAD model are different, the calibration will not be accurate.
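The three taught points boil down to a standard frame construction: an origin, a point along +X, and a point on the +Y side of the XY plane. A rough sketch of that math (helper names are mine, not anything from Roboguide):

```python
from math import sqrt

def _sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def _unit(v):
    n = sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def frame_from_3_points(origin, x_point, y_plane_point):
    """Axes of a frame taught 3-point style: origin, +X point, Y-plane point."""
    x_axis = _unit(_sub(x_point, origin))
    # Normal of the plane spanned by the taught X and Y directions:
    z_axis = _unit(_cross(x_axis, _sub(y_plane_point, origin)))
    y_axis = _cross(z_axis, x_axis)  # completes a right-handed triad
    return {"origin": list(origin), "x": x_axis, "y": y_axis, "z": z_axis}

# A frame taught 100 mm out in X, axes parallel to world:
f = frame_from_3_points([100, 0, 0], [200, 0, 0], [100, 100, 0])
```

This is only the geometry; Roboguide's best fit additionally reconciles the virtual and the touched-up real points.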
Sent from my iPhone using Tapatalk
-
When you say the USER frame should be the same in both the real and virtual controllers, how do I go about doing this?
-
The only thing that matters is the relative position of the part with respect to the robot.
For example, the real robot might not be perfectly level like the model.
But there is no need to locate everything, unless you are simulating a multi-robot system, for example.
The USER frame is linked to the robot, so you need to input the same values in the virtual robot.
Then you follow the calibration procedure to locate the part model.
You can input the frame values in the virtual robot or load the frame system file from the real robot to roboguide.
If you create a simulation from a backup, the FRAME settings are also copied.
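For reference, the x, y, z, w, p, r values you copy over have a concrete meaning: on FANUC controllers W, P, R are commonly documented as fixed-axis rotations about X, Y, Z. A hedged sketch of turning them into a rotation matrix (my own helper, not a Roboguide or controller API):

```python
from math import cos, sin, radians

def wpr_to_matrix(w, p, r):
    """Rotation matrix from FANUC-style W, P, R angles (degrees),
    assuming fixed-axis X, Y, Z rotations composed as Rz(r) @ Ry(p) @ Rx(w)."""
    w, p, r = radians(w), radians(p), radians(r)
    cw, sw = cos(w), sin(w)
    cp, sp = cos(p), sin(p)
    cr, sr = cos(r), sin(r)
    return [
        [cr * cp, cr * sp * sw - sr * cw, cr * sp * cw + sr * sw],
        [sr * cp, sr * sp * sw + cr * cw, sr * sp * cw - cr * sw],
        [-sp,     cp * sw,                cp * cw],
    ]

# R = 90 deg turns the frame's X axis onto world Y:
m = wpr_to_matrix(0, 0, 90)
```

This is why entering even slightly different W, P, R values in the two controllers tilts every recalculated point.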
-
So I've been doing the calibration procedure and applying the offset to the robot. Then I redo the User Frame with the 3-point method.
So what I should be doing is:
1. Do the calibration procedure
2. Apply the calibration offset
3. Input the UFrame values from the real robot into Roboguide via direct entry?
-
The USER frame definition MUST be done BEFORE,
because the calibration program is created using the frame.
Then when you load the touched-up calibrated program, the offset is calculated to update the object/part position.
-
When I do the calibration, the program it creates always uses UFrame 0. When I apply the offset, I apply it to the robot itself, because my object/part is a cradle that sits in a horizontal position, and applying the offset to it would ruin its position when it needs to rotate. When the calibration creates my 3-point program, do I need to change it to use the UFrame I'm using?
-
I played with the calibration function:
Created a fixture and linked USER FRAME 1 to it.
Created a TEST program using USER FRAME 1 and made a copy (TEST_copy).

Using the calibration function:
Step 1: Roboguide generated the CAL1 program (USER FRAME 0).
Touched up the calibration program inside the virtual cell.
Step 2: Saved the program.
Just to keep the original program, I manually printed the CAL program to MC: CAL1.ls. Here you would save the CAL1.tp program to a USB stick, load it in the real robot,
touch up the points, and copy the program back to the MC directory of the virtual robot.
Step 3: Loaded the calibrated program.
Applied the offset to the fixture (or to the robot).
I selected the TEST program for shifting.

The results:
The fixture was moved correctly.
Since USER FRAME 1 is linked to the fixture,
the TEST_copy program was shifted correctly even though it was not selected to apply the offset.
The original TEST program was double-shifted, because the offset was applied both to the program
and, indirectly, to USER FRAME 1.
I created a second program using USER FRAME 2 (not linked to the fixture),
and the offset was applied correctly.

Conclusions:
Calibration adjusts the relative position of objects with respect to the robot.
It also applies the offset to programs, leaving the USER FRAME unaffected
(unless it is attached to the calibrated object). Since the positions are recalculated to work with the virtual USER FRAME,
the real USER FRAME definition MUST match. The model geometry should be as close as possible to the real object.
-
When I do the calibration, the program it creates always uses UFrame 0. When I apply the offset, I apply it to the robot itself, because my object/part is a cradle that sits in a horizontal position, and applying the offset to it would ruin its position when it needs to rotate. When the calibration creates my 3-point program, do I need to change it to use the UFrame I'm using?
The CALibration program is generated using the current active TOOL frame and USER FRAME 0.
When Roboguide recalculates your application program points, it checks which USER FRAME was used,
then it reads the USER FRAME parameters to do the math. That's why, when you transfer the application program to the real robot, you should also transfer the USER FRAME parameters if they differ.
Ideally you make the USER FRAMEs equal before any calibration, either by changing the virtual frame to match the real one or vice versa.
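The recalculation described above boils down to: world position = user frame applied to the recorded position. So the same recorded values land somewhere else whenever the two controllers hold different frame values. A translation-only sketch (illustrative numbers, not from the thread):

```python
def add(a, b):
    return [a[i] + b[i] for i in range(3)]

recorded = [10.0, 0.0, 0.0]        # position values stored in the program
virtual_uf = [100.0, 0.0, 0.0]     # UFRAME origin entered in Roboguide
real_uf = [103.0, 0.0, 0.0]        # UFRAME origin on the real controller

virtual_world = add(virtual_uf, recorded)  # where Roboguide thinks it lands
real_world = add(real_uf, recorded)        # where the real robot actually goes

# The frame mismatch shows up 1:1 as position error at the tool:
mismatch = [real_world[i] - virtual_world[i] for i in range(3)]
```

Matching the frames before calibration drives this mismatch to zero, which is exactly the "way off" symptom in the first post.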