Posts by factoryrat
-
Lost mastering... Robot at a known position. Can I remaster to this position and be able to run my programs?
I took a Fanuc Robot (R-30iA, M-710iC, Ver 7.3) to a known purge position, then turned off the controller to move an EE cable. When I powered the robot back up I found I had bad batteries and ended up with a BZAL alarm on all 6 axes. I replaced the batteries, cleared the BZAL alarm, and cleared the SERVO 62 alarm after powering back up. When I scrolled through the purge program and looked at the position where the robot was when I lost my mastering, I saw the following information:
X=-4552.8 mm W=-175.42 deg
Y= 1660.0 mm P= 28.94 deg
Z= -1881.3 mm R= 2.13 deg
All the mastering alignment marks on the wrist are gone, worn off, so mastering to the marks would be impossible. The robot is still physically at this purge position. Can I use the teach pendant info to re-master at this position?
If my memory serves me correctly, I have seen this done before, but I can't remember all the steps involved. Can anyone help me? -
Thanks BigFrank612 for the clear and concise answer. The best part is I understand what you said. Thanks again.
-
I forgot: I also believe the 7th-axis mastering had been changed in the past couple of years and never documented, so I recorded and documented the value shown in the robot system variables.
-
We build 4 different model cars, and after doing the above-mentioned process I was able to run all 4 models with no touch-ups needed. Day shift production built cars all day with no problems. I believe that if the 7th-axis batteries had been dead, the robot would have been mastered to a new position and all the programs would be off. We have many, many robots in our plant. I have seen similar situations before, but in those cases a robot expert is always called in to resolve the problem. On this occasion I did not have that luxury and was forced to try to solve the problem on my own. I am just wondering what exactly happened and how it can be prevented (if that is possible). Did I really lose mastering? Just trying to learn what I don't understand.
-
Can anyone help me understand why this happened and how to avoid it in the future? I am a maintenance electrician at one of the auto companies.
We have Fanuc R-30iA Controllers (Ver 7.3) with M-710iC robots. We run 3DL Multi-View Vision and HEM flange seal with these robots. I had to change a cable that had some broken wires in it, so I moved the robot to a low position where I could work on it and then turned controller power OFF. During the process of changing the cable (EOAT4CBL-EOAT JBOX) I think we actually moved the robot a little bit.
When I powered the robot back up I was greeted with a "pulse coder mismatch" alarm. I went to the Master/Cal menu and reset it with "RES_PCA", and I could then jog the robot. But when I tried to run the "Mov_Home" macro I got a "MOTN_49" alarm (attempt to move w/o calibration). So I went back to the Master/Cal screen and did the "Calibration", with no noticeable result; I got the same alarm when I tried to run the "Mov_Home" macro again.
After some time, much thought, a bit of desperation, and using this website for education and information, I went into the robot variables. Inside our robot controller cabinet is a data sheet with the pulse counts for the mastered position. So I called up the robot screen that shows the numbers stored in the controller and compared them to the data sheet master counts. All were identical except axis 7 (the X-rail). After more thought, and with no one to consult, I knew I had to try something, so I set the variable "$Master_Done" to TRUE and set "$REF_DONE" to TRUE. Then I went back to the Calibration screen and did the calibration, and this time it showed the robot position on the teach pendant screen. From memory I knew this was good. I think I cycled power and was then able to run the "Mov_Home" macro.
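The check described above boils down to a simple per-axis comparison. Here is a minimal sketch of that logic; every pulse count below is invented purely for illustration (on many R-30iA controllers the stored master counts are found under $DMR_GRP, but treat that variable name as an assumption to verify against your maintenance manual):

```python
# Hypothetical master pulse counts, invented for illustration only.
# "datasheet" plays the role of the printed sheet inside the controller
# cabinet; "controller" plays the role of the values shown on the pendant.
datasheet  = {1: 8123456, 2: -204881, 3: 991732, 4: 55012,
              5: -77345, 6: 12009, 7: 4410021}
controller = dict(datasheet)
controller[7] = 4987654  # axis 7 (the X-rail) no longer matches

mismatched = [axis for axis in sorted(datasheet)
              if datasheet[axis] != controller[axis]]
print(mismatched)  # -> [7]
```

Any axis that shows up in `mismatched` is the one whose mastering data changed while power was off, which matches what was found here.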
Thank goodness for this website, because I used it to get enough info to successfully get the robot running and ready for day shift production. The feeling of accomplishment was exhilarating and the earned respect of my peers was great. THANKS again to all who contribute to the robot forum.
-
Thanks again. I agree with you and will continue to make the argument for taking the time, getting the proper expert help, and doing and maintaining this stuff correctly. Hopefully these steps will help our quality and repeatability.
-
Thank you HawkME for your clarifying and enlightening answer! You opened my eyes and I have a better understanding of what User Frames are and their purpose.
In response to "How did you calibrate the vision system?": it was calibrated by Fanuc when installed about 6 years ago. We have permanent calibration grids mounted in the cells. I recently ran the calibration check and some of the checks fail. I believe that, over time, robot-mounted cameras and lasers get bumped and slightly moved; running two shifts, with different people interacting with the robots, stuff happens. When I ask whether this should be part of our PM and be checked and tuned in on a scheduled basis, I am told we are running 6 different model cars, there are degrade paths too, and it would be a massive effort ("nightmare" I think was the word) to maintain it all. As long as we are running and spitting out cars (numbers, numbers, numbers), management is happy. If you mention working a weekend on this kind of stuff, you will get laughed out of the office with a size 12 stuck up you know where.
-
As you can see, the car's deck lid is not a perfect, flat, horizontal plane. When I taught the 3-point User Frame I tried to touch the center of the back edge of the deck lid for my Origin (it is physically the highest point on the trunk). For +X I tried to touch the center of the front edge of the deck lid (slightly lower than the Origin), and for +Y (my lowest taught point) I tried to touch the driver's side of the rear of the trunk. The choice of these three points was not scientifically calculated; it just seemed the easiest for me.

If I remember correctly, after I finished teaching my User Frame the robot moved properly when I jogged it, but because my +Y point was lower than the Origin and the +X point, my plane was slightly skewed: when I moved in a -Y direction the robot moved slightly in +Z instead of along a perfectly horizontal line. My question is: should I just teach a User Frame that is in a perfect horizontal plane? Could I touch the highest point for the Origin and then teach +X and +Y at the same +Z dimension? We are using this User Frame in conjunction with our Fanuc 3D Multi-View software to get offsets so we can HEM flange seal the edges of the deck lid.
As you can see, I do not fully understand what I am doing (but I am trying). I think we use the User Frame as a reference: we teach our points to that job, and after that we reference this original job to get our offsets for every job that follows. I looked at the Fanuc manual example and it does not tell me a lot. Any comments or suggestions are welcome.
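For anyone curious about the geometry behind the skew described above, here is a pure-Python sketch of what a 3-point frame conceptually computes (the touch-point coordinates are made up for illustration, and this is a simplified model of the math, not Fanuc's actual implementation): X runs from the Origin toward the +X point, the +Y point fixes which side of the plane is +Y, and Z is their cross product. Because the +Y point is taught lower than the Origin, the frame's Y axis picks up a downward component, so jogging -Y in that frame climbs in world +Z.

```python
import math

def sub(a, b):  return [a[i] - b[i] for i in range(3)]
def dot(a, b):  return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def unit(a):
    n = math.sqrt(dot(a, a))
    return [v / n for v in a]

def three_point_frame(origin, x_pt, y_pt):
    """Conceptual 3-point frame: X toward the +X touch point, Y made
    perpendicular to X within the plane of the three points, Z = X x Y."""
    x = unit(sub(x_pt, origin))
    y_raw = sub(y_pt, origin)
    d = dot(y_raw, x)
    y = unit([y_raw[i] - d * x[i] for i in range(3)])
    z = cross(x, y)
    return x, y, z

# Hypothetical deck-lid touch points in world mm (invented numbers):
origin = [0.0,     0.0, 100.0]   # rear edge, highest point
x_pt   = [800.0,   0.0,  90.0]   # front edge, slightly lower
y_pt   = [0.0,   600.0,  60.0]   # driver's side, lowest point

x, y, z = three_point_frame(origin, x_pt, y_pt)
# Jogging -Y in this frame moves the tool by -y in world coordinates;
# its world-Z component comes out positive, i.e. the robot climbs:
world_dz = -y[2]
print(world_dz > 0)  # -> True
```

If all three points were touched at the same height (or the +Y height matched the Origin), the frame's Z would stay parallel to world Z and the -Y jog would be perfectly horizontal, which is what the "teach in one horizontal plane" idea would achieve.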
-
The attachment depicts the 3 points I chose for teaching my new User Frame for the deck lid. I do not know if I should have taught the +Y from the Origin or if the way I did it is acceptable. Any comments appreciated.
Also, I tried typing the new User Frame values directly into the Position Register. Is this the proper way to do it? I chose to save it as Cartesian. Does this sound proper?
-
We use Fanuc 3D Multi-View vision to locate a car's door, hood, and deck lid panels as it sits on a skid. We do the HEM flange seal on these panels. It has been hard to get a consistently good HEM on the deck lid. We did a 3-view find on the rear surface of the trunk, but sometimes the quality of the HEM up along the back window of the deck lid was poor.
It was decided that the deck lid lies in two planes, a horizontal plane and a vertical plane. We were only doing our vision on the vertical plane, and perhaps this was why the offsets for the horizontal plane were off and we were getting poor-quality HEMs.
So I am trying to teach a User Frame for the horizontal plane of the deck lid. I have a 10" pointer that I screw into the robot end effector. I tried the 3-point method, and when I finished I found that when I jogged the robot in the newly taught User Frame, it moved in the opposite direction from before: if I jogged in +X the robot moved in -X, and if I jogged in what should be -Y the robot moved in +Y. The User Frame is set up so that +X points toward the front of the vehicle and +Y points toward the driver's side of the car.
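The symptom described above, every jog coming out reversed, is exactly what you would see if the Origin and +X touch points were swapped (or touched on the opposite ends of the part from what was intended). A tiny sketch with made-up coordinates shows why:

```python
# Hypothetical touch points along the car's length (invented numbers, mm):
rear  = [0.0,   0.0, 0.0]
front = [800.0, 0.0, 0.0]

# Intended: Origin at the rear, +X point at the front.
intended_x = [front[i] - rear[i] for i in range(3)]
# Swapped by mistake: Origin at the front, +X point at the rear.
swapped_x  = [rear[i] - front[i] for i in range(3)]

print(intended_x)  # -> [800.0, 0.0, 0.0]
print(swapped_x)   # -> [-800.0, 0.0, 0.0]: every +X jog goes world -X
```

Re-touching the three points with the Origin and +X on the intended ends (and +Y on the driver's side) should restore the expected jog directions.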
Also, when I finish teaching the new User Frame correctly (I hope that happens), how do I put it into a Position Register? That is how we call our User Frames: User Frame 1 = PR[129]. How do I get my newly taught User Frame into the Position Register?
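On the Cartesian representation itself: a Cartesian PR holds (X, Y, Z, W, P, R), where X, Y, Z is the taught Origin and W, P, R encode the frame's orientation. As a sketch of how a frame's axes map to those angles, here is pure-Python code assuming the commonly documented Fanuc convention of fixed-axis rotations about X (W), then Y (P), then Z (R); verify that convention against your controller's manual before relying on it:

```python
import math

def matrix_to_wpr(x_axis, y_axis, z_axis):
    """Convert a rotation matrix (columns = frame axes expressed in world
    coordinates) to W, P, R in degrees, assuming fixed-axis X-Y-Z rotations.
    This is a conceptual sketch, not Fanuc's actual implementation."""
    m00, m10, m20 = x_axis
    m01, m11, m21 = y_axis
    m02, m12, m22 = z_axis
    p = math.atan2(-m20, math.hypot(m00, m10))  # pitch about Y
    w = math.atan2(m21, m22)                    # roll about X
    r = math.atan2(m10, m00)                    # yaw about Z
    return tuple(math.degrees(a) for a in (w, p, r))

# Identity frame (axes aligned with world): all angles are zero.
print(matrix_to_wpr([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # -> (0.0, 0.0, 0.0)

# Frame rotated 30 degrees about world Z: only R is nonzero.
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
print(matrix_to_wpr([c, s, 0], [-s, c, 0], [0, 0, 1]))  # ~ (0, 0, 30)
```

If the controller supports it, a TP assignment along the lines of PR[129]=UFRAME[1] may copy the frame into the register without retyping numbers; check the Handling Tool manual for the exact instruction available on your software version.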
Thanks in advance. -
Hello,
I have PCMCIA slots on our Fanuc System R-J3iB robot controllers. I was wondering if there is an adapter card or device that would allow me to make my backups to a USB drive. I also have some newer R-30iA controllers that allow backing up directly to thumb drives. I see many adapters advertised on the Web but do not know which one would be best for Fanuc robots (if this would even work). Any information and your thoughts would be appreciated.