So I've been having some issues with calibrating my robots. I have to do a lot of touch-ups to my Roboguide-created programs. My method of calibrating is the 3-step method on the calibration tab; then I apply the offset to the actual robot, since we have multiple parts/fixtures that get swapped in and out. I get pretty close, and sometimes I run this method a few times to get it as close as I can. Then I'll redo my UFRAME in Roboguide after the calibration offset settings have been applied to the robot. I'm thinking this is where I might be missing something. The real robots already have a UFRAME set up. I have used the same UFRAME number in Roboguide as in the real robot (UFRAME_NUM = 1), and this has my program points way off (pic 1). I have also copied the UFRAME settings from the real robot into my Roboguide UFRAME settings (pic 2); this gets my calibration closer, but still not accurate. What methods are you guys using, and what am I missing in my approach, or doing wrong?
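For anyone comparing copied UFRAME values against re-taught ones: the 3-point method builds the whole frame from the three taught points, so any error in those touch-ups tilts or shifts the entire frame. A rough sketch of the underlying math in plain Python (the point values are invented for illustration; this is not a Roboguide API):

```python
# Sketch of the 3-point user frame construction: origin point,
# a point along +X, and a point in the XY plane toward +Y.
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def norm(a):
    m = math.sqrt(dot(a, a))
    return [x / m for x in a]

def uframe_from_3_points(origin, x_pt, y_pt):
    """Return (origin, 3x3 rotation) of the frame taught by 3 points."""
    x_axis = norm(sub(x_pt, origin))
    # Z is perpendicular to the plane of the three taught points.
    z_axis = norm(cross(x_axis, sub(y_pt, origin)))
    y_axis = cross(z_axis, x_axis)  # completes a right-handed frame
    return origin, [x_axis, y_axis, z_axis]

# Example: a frame with no tilt, just shifted 100 mm in X.
org, rot = uframe_from_3_points([100, 0, 0], [200, 0, 0], [100, 50, 0])
# rot comes out as the identity here; a 1 mm error in any taught
# point would tilt the axes and move every program point with them.
```

The takeaway is that a frame copied numerically and a frame re-taught from points only agree if the taught points are exact, which is why the two approaches can land you in different places.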
Posts by ORDEP81
-
-
I've tried that as well, and same thing.
-
I have been working with this cell for months; all of a sudden, when I go to open it (as well as backups), it hangs up and crashes while 'Creating or recreating Virtual Robot Simulator', as it's trying to reserialize the robot in the cell. If I close out of this, I get an error message as shown in the pics. Any help with this? I have sent all the info to Fanuc Roboguide support, but I always get a very generic answer back and usually no follow-up after they send their initial 'help' email. All my other cells open up no problem.
-
No, the robot has not been remastered. You mean the accuracy between virtual and real world? If so, that is what I was talking about: 1-2 mm, but sometimes some points are 5 mm off. If not, the real world on its own never has any issues, as our robots are new.
-
I'm using WeldPRO, so I'm pretty anal about getting my points close. Most are within 1-2 mm, but I also get points that are way off, like 5 mm. This is after calibrations.
-
I'm curious as to how accurate your points are in Roboguide compared to the real world. Even after calibration, how close do you get, and do you still have to touch up your program when it's taken to the real world?
-
To be accurate, why not just copy the frame coordinates from the robot to the virtual robot? Highlight the frame on the bot, hit <Details>, and mark down your XYZWPR.

Will doing this affect the calibration between the virtual and real-world robots?
-
So my real-world robot has 4 different UFRAMEs set up, all using the 3-point method. I have set up my virtual robot with only 2 user frames, using the same 3-point method and points as in the real world. Should my origin point coordinates match between real world and virtual?
-
What is the file I need to import the real-world weld procedures/schedules into WeldPRO?
-
OK, so I figured out my issue, but have not solved it yet. When my object was calibrated, it got moved off my rotation-0 axis. The issue is I do not know how to fix this so that it will rotate from my calibrated axis. Attached are the object at 0 deg and at 180 deg. The red triad is where my object is currently sitting, and the orange square is my zero point, which is what it's all rotating from.
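In case it helps anyone reason about the pivot problem: rotating about a calibrated pivot instead of the original zero point is just a change in the order of operations, translate to the pivot, rotate, translate back. A minimal 2-D sketch (coordinates are invented; this is general transform math, not a Roboguide function):

```python
# Rotate a point about a chosen pivot: p' = R * (p - pivot) + pivot.
import math

def rotate_about(point, pivot, deg):
    t = math.radians(deg)
    c, s = math.cos(t), math.sin(t)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + c * dx - s * dy,
            pivot[1] + s * dx + c * dy)

# Rotating (10, 0) by 180 deg about the origin lands near (-10, 0),
# but about the pivot (10, 0) the point stays put:
print(rotate_about((10, 0), (0, 0), 180))
print(rotate_about((10, 0), (10, 0), 180))
```

If the object's offset from the real pivot isn't subtracted out before the rotation is applied, it sweeps an arc around the wrong zero point, which matches the red-triad vs. orange-square picture described above.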
-
Yes, those are good.
The instructor was just using user frames to match up the virtual and real-world robots, and it worked great on another robot. We never used the 3-step calibration tool in Roboguide. Not sure what's going on now. I have been trying different things and multiple recalibrations over the last few weeks.
-
Just a thought: does my UFRAME need to be redone after calibration? I set my object back to its original position, pre-calibration, ran a 6-point test program using the UFRAME (3-point method, set up in Roboguide and on the physical robot), and none of the points lined up. I moved my object back to its calibration coordinates and checked my UFRAME reference point, and it's off. Would this make a difference, and should I redo my UFRAME (3 points)?
-
-
As far as the real-world robot goes, everything works great and has been working fine for months. It's just when trying to get Roboguide programs transferred onto it that the issue comes into play. The virtual programs do not line up with the real world.
-
The real robot does use coordinated motion, but it is not set up in Roboguide.
-
Hi
What kind of "pretty close" dimensions are we talking? Is your fixture 2 meters long or 5 mm long?
My first thought: "Do I have a really good TCP?" That is critical; all your real-world dimensions are going to be based on your TCP, and problems when you rotate are typical of a bad TCP.

So on the 'normal' side, after calibration, I'm within 1-2 mm out in the real world. But once the table flips 180, I'm up to 1/2" off. The fixture is 6' x 5'.
Now the TCP was set up by our instructor as he was setting up our cells while teaching, and for some reason there was an offset, though I do not recall his reasoning for this.
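To put a rough number on why a bad TCP hurts most on the flip: the TCP error vector rides along with the tool orientation, so after a 180 deg rotation its in-plane component reverses sign, and the two sides of the table can disagree by up to twice the error. A toy sketch (the 1.5 mm figure is an invented example, and the flip is idealized as a pure rotation about Z):

```python
# How a fixed TCP error e looks in the part frame after the tool
# is rotated by flip_deg about the Z axis.
import math

def apparent_error(e, flip_deg):
    t = math.radians(flip_deg)
    c, s = math.cos(t), math.sin(t)
    return (c * e[0] - s * e[1], s * e[0] + c * e[1], e[2])

e = (1.5, 0.0, 0.0)            # mm of TCP error along tool X (made up)
e0 = apparent_error(e, 0)      # error at the flat position
e180 = apparent_error(e, 180)  # error after the table flips
gap = tuple(b - a for a, b in zip(e0, e180))
print(gap)  # roughly (3.0, 0, 0): twice the TCP error
```

This doesn't account for all of a 1/2" shift on a 6' x 5' fixture, so a frame or pivot error is likely stacking on top, but it shows why re-verifying the TCP (e.g. the six-point teach) is usually the first check when rotation makes things worse.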
-
So I'm running calibration on my fixture object in my workcell. I run my 3-point touch-up in the real world, and the object gets its offset correction in Roboguide. I create a 6-point test program in Roboguide (3 points with the fixture table flat and another 3 points on the other side, at 180°) to test how close it comes to the real world, and it's usually pretty close. The issue comes when the table is turned 180°: my points seem to shift. I'm not sure what I could be doing wrong. Any insights, help, or suggestions?
-
-
After calibrating the real-world robot with the virtual robot, I get the prompt to allow the offset of my fixture, and then I get the prompt to offset my programs as well. I keep getting an error, and my programs do not adjust. Is there a way to adjust the programs somehow?
-
I've been playing with the Draw Features function in WeldPRO, and other than it generating a quick program for only that feature, I don't see much use for it. Please correct me; there have to be much better uses that I'm not seeing here. How can I use this to help me be more productive?