RoboGuide: When the base of a Fanuc robot is rotated, the UserFrame rotates with it... how can I avoid that?

  • Hi,

    I'm using a Fanuc R-2000iA on a track. Parallel with the track I have a User Frame for the product. This User Frame is NOT attached to the robot but to a static object.

    The real robot is mounted on the wagon of the track, rotated 10 degrees around the (vertical) Z-axis. I'm building the same situation in RoboGuide, so I rotate the robot 10 degrees on the wagon. I then notice that the User Frame has also rotated 10 degrees.

    Why is this happening and how can I avoid this?

    Thanks in advance,
    Emile

  • I think controllers of that era are limited to assuming the track direction is along one of the axes of the robot's base.


    When you set up the rail for the robot in a controlled start, did you tell it that it is rotated 10 degrees?

  • Hi,
    It is an old controller indeed, about 20 years old.

    During setup of the rail I could only choose the direction it runs along: X, Y, or Z, but no exact angle. Is this kind of correction possible on modern controllers?
    Worst case, we will have to rotate the robot physically :o/
    Thanks!

  • It is possible to set an arbitrary angle on the newer controllers. I do this on dual rail systems to get the rails both "pointing" in the same direction so mirroring from one side to the other is easy to do.


    In your case however, it sounds like you might have to get out the porta-mag. I would give Fanuc a call though before doing that. Maybe there is some system variable that can be updated to change it.

  • The only solution on the older robots is to master J1 off by 10 degrees.


    You would move J1 to the 10 degree position, then Single Axis master only J1 to 0 at that position.


    Of course you need to ensure you won't hit any axis limits, or even better adjust them by 10 degrees to compensate. Then you must update your tool and user frames.

  • Thanks Nation, this morning I spoke to someone at Fanuc, but not yet the right person... to be continued.
    HawkME: the solution you describe is the cause of the problem I have at the moment ;o) The robot is placed with its back to the product that must be milled by the robot (as a result of some historical moves). In order to 'fix' this issue I mastered the robot as far as possible towards the mechanical end-stop. The maximum angle the robot can rotate 'in the desired direction' is 80 degrees, so now I need to compensate for the remaining 10 degrees... which is the problem I'm dealing with. ['The desired direction' is because I want J1=0 on that side.]


    All in all:

    I think that rotating the robot by 180 degrees can be a solution. The result (of quite some work) is:

    - the robot is facing the product,
    - J1=0.0 can be mastered in such a way that it matches the X-direction of the rail,
    - the robot (frame) position then corresponds with the RoboDK project and no correction (of 10 degrees) is needed in the controller.

  • Thanks guys. To save time I stopped investigating the (im)possibilities of the RJ3iB controller. In the configuration of an extra axis there is no setting for an angle, only a choice of X, Y, or Z. I think it's safe to conclude that the controller only accepts an orthogonal setup.

    Today we rotated the robot and mastered joint 1 in such a way that the X-axis is parallel with the rail (so no correction angle is required anymore).

    The RoboDK configuration, RoboGuide configuration and the real robot setup match and it all works.

    Weekend!

  • OK, I'm a very impressed newbie here and have been following and trying to learn as much as possible. Maybe the Jedi here can help me solve a similar problem (if it's actually a problem)... here it goes: a Fanuc R-2000, 125L, on an R-J3iC controller (the big one); some say it's a 'B'.

    We installed the robot on a 60" tall pedestal riser mount that's bolted down with the front of the robot's base at a 45 degree angle to our conveyor. We've been running it for six months in joint mode and learning more. We just hit 90 hours on the robot's hour meter, and I'm questioning the mount geometry since we want to implement line tracking and some more things here.

    The problem seems to be that, since the base was bolted down at a 45 degree angle to the conveyor, it seems awkward to get the robot to jog or run with its X or Y axis in a smooth linear path parallel to the direction of the conveyor. How do we make up for the difference, with its base plate at 45 degrees to the conveyor travel and not 90 degrees like most other installs? This may be easy or it may be hard, but I'm just trying to learn the pros and cons of optimal robot mounting for the future and not get caught in a corner.

    The current 45 degree angle to the conveyor and the 60" mounting height were chosen to maximize the useful reach of the robot, without really thinking of the 45 degree relationship between the robot and the conveyor as a problem, which I thought could be offset by a user frame? In my newbie-ness, I'm hoping it's something simple or not a problem at all. Thank you for any help and input here. John NeySEA.

  • Create a tool frame and user frame.


    Tool frame defines your tool center point.

    User frame defines a coordinate system based on your conveyor.


    Then, with those frames active, teach your program points. All jogging and program points will then be relative to your conveyor.

  • I'm questioning the mount geometry since we want to implement line tracking and some more things here... the problem seems to be that, since it was bolted down at a 45 degree angle to the conveyor, it seems awkward to get the robot to jog or run with its X or Y axis in a smooth linear path parallel to the direction of the conveyor.

    So, these are two distinct issues with similar answers.


    First, to do line tracking, creating a Tracking Frame (basically a special type of UFrame) on the conveyor is an absolute requirement. And this is part of the Line Tracking option package from Fanuc -- I assume you purchased it? Basically, a Tracking Frame is taught by touching a point on the conveyor belt, then jogging the conveyor downstream some distance, then touching that point again. That defines the X+ direction of the Tracking Frame, and establishes the encoder/mm ratio. You also touch off a 3rd point on the belt, at 90deg to the X axis, to establish the Y and Z axis directions.
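
    If it helps to see the arithmetic, here is a rough sketch of what that two-touch procedure establishes (plain Python, all numbers made up for illustration; the controller of course does this internally):

        import math

        # Touch 1: TCP position (mm) of a point on the belt, plus the encoder count
        p1 = (1250.0, -300.0, 85.0)
        enc1 = 102400

        # Jog the conveyor downstream, then touch the same point on the belt again
        p2 = (1250.0, 200.0, 85.0)
        enc2 = 152400

        # Distance the belt actually moved between the two touches
        dist_mm = math.dist(p1, p2)          # 500.0 mm in this example

        # Encoder scale: millimeters of belt travel per encoder count
        scale = dist_mm / (enc2 - enc1)      # 0.01 mm/count here
        print(f"Tracking scale: {scale:.4f} mm/count")

        # The vector from p1 to p2 also fixes the X+ direction of the Tracking
        # Frame; the 3rd touched point (off to one side) then pins down Y and Z.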


    Second, for jogging, you need to create a normal UFrame that is aligned with your conveyor. That's under the SETUP > FRAMES menu on the pendant.


    The reason UFrames and Tracking Frames exist is that, no matter how good your build is, it's essentially impossible to build a conveyor, or pallet holder, or work table, etc., that is perfectly aligned with the robot's UFrame 0. And even if it were physically possible, what happens when someone builds a cell that crams in multiple conveyors/tables/etc., all at different odd angles to the robot and to each other? Often that's the only way to fit everything into the space constraints of a factory.


    So UFrames (for stationary tooling) and Tracking Frames (for moving tooling like conveyors) are created to leverage the robot's inherent geometrical flexibility. You can create a number of different UFrames, each one associated to a different physical tooling unit, and give yourself optimized "jogging" for that tooling, depending on which UFrame you have active.


    In your programs, each point you teach is associated to a particular UFrame. If you don't create any UFrames, then the points are all in UFrame 0, the robot's default UFrame. In a program, you can activate different UFrames at will (activating Tracking Frames is different, and is part of the Line Tracking commands). The best part (for a programmer) of having good UFrames, and teaching your points as part of those UFrames, is that if someone moves the robot or the tooling, you can simply re-teach the UFrame (using the same reference points on the tooling as when setting up the original UFrame) and your program points will move to match the new physical condition, saving you a lot of work.


    Having a UFrame set up properly also lets you do things like palletizing patterns using mathematical offsets, without needing to do trigonometry in your program.
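
    To make the "points follow the frame" idea concrete, here is a small illustrative sketch using homogeneous transforms (plain Python/NumPy, simplified to a rotation about Z only, with made-up coordinates; the controller does the full 6DOF XYZWPR version of the same math):

        import numpy as np

        def frame(x, y, z, rz_deg):
            """Build a simplified 4x4 frame: translation plus rotation about Z only."""
            c, s = np.cos(np.radians(rz_deg)), np.sin(np.radians(rz_deg))
            T = np.eye(4)
            T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
            T[:3, 3] = [x, y, z]
            return T

        # A point taught in the user frame: 100 mm along the conveyor (UFrame X+)
        p_uframe = np.array([100.0, 0.0, 0.0, 1.0])

        # UFrame as originally taught on the conveyor
        uf_old = frame(1500, -200, 850, 45)
        print("world position, old UFrame:", (uf_old @ p_uframe)[:3])

        # The conveyor gets moved and the UFrame is re-taught. The taught point
        # data never changes, but its position in world space follows the frame.
        uf_new = frame(1520, -180, 850, 47)
        print("world position, new UFrame:", (uf_new @ p_uframe)[:3])

        # Palletizing-style offset: 3 columns of 50 mm along UFrame X and
        # 2 rows of 80 mm along UFrame Y -- no trigonometry in the "program".
        offset = np.array([3 * 50.0, 2 * 80.0, 0.0, 0.0])
        print("offset point in world:", (uf_new @ (p_uframe + offset))[:3])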

  • Thank you Hawk and Skyefire both for your explanation and help here. Yes I feel much better about things today as a result of asking for help.

    We currently do not have the line tracking software option or encoder hardware, but we do have 8 new machines coming which will have HandlingTool Pro and line tracking with Fanuc-supplied high-resolution encoder hardware. Thanks for saying it's a must-have option without any doubts. Your explanations and recommendations give me more confidence and direction to start digging into creating and calling the proper frames for each routine and hardware config.

    Currently we use a compiled Karel program we wrote, loaded on the TP pendant with a small .stm HMI file running on the controller/TP. The Karel routine monitors our sensors using just discrete/boolean logic and calls 3 different TP programs. We have gotten far without any manuals, and as of about 2 weeks ago we finally have access to the Fanuc customer resource center portal. Now we have more questions as a result. Like the old saying goes, you can only learn so much from a book or from self-discovery, a.k.a. trial and error, right! Lol! Anyway, thank you very much again.

    In summary, my gap is understanding the best place to put the origin of any particular frame. And what is the tool frame's origin in relation to a yet-to-be-created user frame, if there is any relationship between tool frames and user frames at all? Why are we told to create the tool frame TCP first and then the user frames? How do those frame origins, each with 6 degrees of freedom in 3D space, benefit each other? I guess I'm just stuck on setting up a user frame with 3 points and assuming it will work with my tool frame around 5 surfaces of, say, a refrigerator box lying lengthwise on a conveyor that is sometimes stopped and sometimes moving. My software engineer is great at software, but on the hardware side we want to fully understand what's currently happening and what the years of Fanuc kinematics development bring to the table, to help make things optimal and not just nominal: cycle times, safety standards, machine wear and tear, multiple robots in a cell, etc. Yikes! Hope this makes sense.

  • I recommend making a sharp pointer that you can attach (repeatably) to your tool. Teach a tool frame for that first.


    Then with that newly taught tool frame active you can teach your user frame. It isn't possible to teach an accurate user frame without the tool first because you are using the TCP to teach the user frame.


    The robot only knows itself, or rather an internal model of itself. Once you teach a tool frame, it knows where that pointer is relative to itself. Then, when you use the pointer to teach the user frame, it knows where the conveyor is relative to itself. It all builds together.


    The best place to put your origin (and X+ and Y+ points) is somewhere you can repeatably locate again in the future, and that lines up straight with your conveyor. I prefer to attach or punch a physical point into the object, such as a side wall of the conveyor.


    A 3-point tool frame and a 3-point user frame are the standard, and are perfect for most applications.
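
    For the curious, here is a rough outline of the geometry behind the 3-point user frame teach (plain Python/NumPy with hypothetical touch points; the pendant does the equivalent internally):

        import numpy as np

        # Hypothetical TCP positions recorded during a 3-point user frame teach
        origin = np.array([1200.0, -350.0, 90.0])   # frame origin on the conveyor
        x_pt   = np.array([1600.0, -348.0, 91.0])   # a point further along X+
        xy_pt  = np.array([1210.0, -150.0, 92.0])   # a point on the +Y side

        # X axis points from the origin toward the X point
        x_axis = x_pt - origin
        x_axis /= np.linalg.norm(x_axis)

        # Z axis is perpendicular to the plane of the three points
        z_axis = np.cross(x_axis, xy_pt - origin)
        z_axis /= np.linalg.norm(z_axis)

        # Y axis completes the right-handed frame
        y_axis = np.cross(z_axis, x_axis)

        # Assemble the 4x4 user frame (rotation axes as columns, plus the origin)
        uframe = np.eye(4)
        uframe[:3, 0] = x_axis
        uframe[:3, 1] = y_axis
        uframe[:3, 2] = z_axis
        uframe[:3, 3] = origin
        print(np.round(uframe, 3))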

  • HawkME has it correct.


    The robot only "knows" UFrame 0 and UTool 0 inherently. That's b/c UFrame 0 is located at the intersection of Axis 1 and Axis 2, and UTool 0 is located at the center of the Axis 6 mounting flange. Since both of those locations are inherent to the physical construction of the robot, they are a "natural" part of the robot and immutable.


    Note: incorrect axis mastering can have negative effects even on UFrame 0 and UTool 0, so for a new robot "fresh from the crate", one of the very first tasks is to check the mastering before attempting to set up any frames.


    Any user-created UFrame is a 6 Degrees Of Freedom (6DOF) offset from UFrame 0. Likewise, any user-created UTool is a 6DOF offset from UTool 0.
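
    If you ever enter those offsets by hand, the math behind a 6DOF offset looks roughly like this sketch (plain Python/NumPy, assuming the usual fixed-axis X/Y/Z convention for W, P, R; values are made up):

        import numpy as np

        def xyzwpr_to_matrix(x, y, z, w, p, r):
            """4x4 transform from XYZWPR, assuming fixed-axis rotations:
            W about X, P about Y, R about Z."""
            w, p, r = np.radians([w, p, r])
            Rx = np.array([[1, 0, 0],
                           [0, np.cos(w), -np.sin(w)],
                           [0, np.sin(w),  np.cos(w)]])
            Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                           [0, 1, 0],
                           [-np.sin(p), 0, np.cos(p)]])
            Rz = np.array([[np.cos(r), -np.sin(r), 0],
                           [np.sin(r),  np.cos(r), 0],
                           [0, 0, 1]])
            T = np.eye(4)
            T[:3, :3] = Rz @ Ry @ Rx
            T[:3, 3] = [x, y, z]
            return T

        # A made-up user frame entered from CAD data: 1200 mm out, 500 mm over,
        # at table height, rotated 30 degrees about Z relative to UFrame 0.
        uf1 = xyzwpr_to_matrix(1200, 500, 850, 0, 0, 30)
        print(np.round(uf1, 3))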


    It is possible to simply set the values of a UFrame or UTool by hand, and sometimes I do this from CAD data. But the vast majority of the time, one teaches a UTool as a pointer, then uses the pointer to teach the various UFrames.


    The relationship between UFrames and UTools comes into play when programming and executing points. If you create a point P[1] on the pendant with UTool 1 and UFrame 5 active, then the XYZWPR values of P[1] define the position of UTool 1 relative to UFrame 5. If you try to run that point with a different UTool and/or UFrame active, the robot will throw a fault.


    Changing the XYZWPR values of the UFrame as part of a program can move P[1] in physical space without changing P[1]'s values. There's all sorts of interesting tricks that can be done this way, but this is the root of how the special Tracking Frame makes Line Tracking possible. The Tracking Frame is a special type of UFrame, taught mostly like a regular UFrame, but the Tracking Frame is "mounted" to the moving conveyor. So points taught in the Tracking Frame (irPickTool provides templates for this that make things simpler) move in physical space to follow the moving conveyor, but the points' mathematical relationship to the Tracking Frame does not change.
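
    As a toy illustration of that "frame rides the belt" idea (plain Python/NumPy, with a made-up encoder scale; the real controller does this continuously in real time):

        import numpy as np

        def slide_along_x(base, shift_mm):
            """Return a copy of a 4x4 frame translated along its own X axis."""
            T = base.copy()
            T[:3, 3] += T[:3, 0] * shift_mm
            return T

        # Tracking Frame as taught (identity rotation for simplicity, made-up origin)
        track0 = np.eye(4)
        track0[:3, 3] = [1500.0, -300.0, 850.0]

        # A pick point taught relative to the Tracking Frame -- this data never changes
        p_track = np.array([50.0, 10.0, 0.0, 1.0])

        # As the conveyor runs, encoder counts convert to mm of belt travel and the
        # frame slides downstream; the taught point follows it in world space.
        scale_mm_per_count = 0.01          # hypothetical, established during the teach
        for counts in (0, 10000, 20000):
            track = slide_along_x(track0, counts * scale_mm_per_count)
            print(counts, "counts ->", (track @ p_track)[:3])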

  • Hey, thank you guys! I'm moving a little slow (FINE CNT, lol) over the last 4 or 5 days... I had to get my eyes dilated 3 times in that time period. Anyway, we're all over your detailed suggestions, recommendations and approaches (no pun intended on 'approaches', lol!). All really helpful and interesting, even the Adam Willea YouTube video series suggestion... yeah, Adam has helped us get far to date. I think he is a natural-born teacher/instructor/professor, rightly coined a Fanuc ambassador. Amazing the depth of knowledge that's available here on the forum, and also what it really takes to make the robots go. I'll keep you folks posted on my progress with this install and the frame builds. Thanks for making me far less scared!! John

  • By the way, regarding your recommendations, Skyefire and HawkME, that proper mastering is paramount before creating/teaching any frames... we have decided to go back, take the suit back off the robot, and remaster all 6 joints. We did it when we first got it and, like you're saying, out of the box it was actually off a bit, mainly on J2 and J3. J1 was minimal, as were the other remaining joints.

    Q: We're already considering changing the grease; in your experience, does changing a Fanuc robot's grease affect the mastering?

    Other question regarding mastering: there is a Fanuc of Japan manufacturing test and joint measurement calibration set of values specific to our machine. Are these values ever really needed for mastering the robot, or are they intended for something else like an entire joint or balancer replacement? Basically, do we need to add or subtract these values to/from our existing witness marks/stickers on the robot's joints? Also, do people ever make a mastering-check TP program with a fixture, which may be easier for checking the master status than taking the suit off, climbing up and down these machines, and stopping the cell? Or is it not worth it? Thx, J

  • I've never had to remaster after greasing, if done properly.


    I've only ever remastered if something was wrong or if a motor was replaced.


    A good way to check mastering is to put a pointer on the robot and see if it can follow a straight line. I do that by jogging it in only one direction and seeing if the motion is curved. It's easier to do if you teach a user frame on a straight edge and then jog up and down that edge. You can also do it in World, but then you need to move your straight edge to align with World.
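
    If you log TCP positions during that jog, you can even put a number on "is the motion curved"; a rough sketch of the check (plain Python/NumPy, logged points are hypothetical):

        import numpy as np

        # Hypothetical TCP positions logged while jogging along one direction only
        pts = np.array([
            [1000.0, 0.0, 500.0],
            [1100.0, 0.4, 500.2],
            [1200.0, 0.9, 500.5],
            [1300.0, 0.5, 500.3],
            [1400.0, 0.1, 500.1],
        ])

        # Fit a straight line through the points (direction = first principal axis)
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        direction = vt[0]

        # Perpendicular distance of each logged point from that best-fit line
        rel = pts - centroid
        dev = np.linalg.norm(rel - np.outer(rel @ direction, direction), axis=1)
        print("max deviation from a straight line: %.2f mm" % dev.max())
        # A deviation well beyond the robot's repeatability hints at mastering trouble.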

  • we have decided to go back, take the suit back off the robot, and remaster all 6 joints.

    ...what suit?


    I would not remaster the axes before first checking whether the Mastering has actually gone bad. It's very rare for Mastering to drift without mechanical damage, some sort of violent electrical event, or a motor replacement.

    Do people ever make a mastering-check TP program with a fixture, which may be easier for checking the master status than taking the suit off, climbing up and down these machines, and stopping the cell? Or is it not worth it? Thx, J

    It's pretty standard practice to teach a "master reference" program that drives all the axes to their 0 positions so that you can eyeball-check the reference marks on each axis. Since it sounds like your robot is less accessible, adding a "quick check" program isn't a bad idea: have a pointer that is permanently part of the end effector, or can be re-mounted consistently, and teach some key points in the robot's normal working reach where the pointer touches some precise point on the fixturing -- a cross-scribe mark, a small rivet head, something precise. More than one, at disparate robot poses, is best. And @HawkME's suggestion of a scribed line to test how linearly the robot can move that pointer is also a good idea -- Linear motions are the "canary in the coal mine" for Mastering issues.
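
    If you ever want to automate that comparison instead of eyeballing it, the logic is simply "touch the same marks and compare to stored values"; a minimal sketch (plain Python, names and numbers are hypothetical):

        import math

        # Reference TCP positions recorded when the mastering was known to be good,
        # one per precise mark on the fixturing (names and values are hypothetical)
        reference = {
            "rivet_head_A": (1420.5, -310.2, 912.8),
            "scribe_cross_B": (980.1, 455.7, 640.3),
        }

        # Positions touched today with the same pointer at the same marks
        today = {
            "rivet_head_A": (1420.7, -310.1, 912.9),
            "scribe_cross_B": (981.9, 454.2, 641.5),
        }

        tolerance_mm = 0.5
        for name, ref in reference.items():
            err = math.dist(ref, today[name])
            status = "OK" if err <= tolerance_mm else "CHECK MASTERING"
            print(f"{name}: {err:.2f} mm  {status}")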


    Note: to do any of this, you need that pointer to be set up as a TCP, and it's worth it to spend the time to tune that TCP in really tightly. But since TCPs are never perfect, your straight-line test should not involve any TCP rotation, but keep a fixed WPR orientation. That should isolate any minor TCP errors from your test results.


    Obviously, all these tests should be set up when you know the Mastering is good. One thing about robots is that they can keep working to a decent degree even when the Mastering is slightly off -- line tracking would fail, but hand-taught points would still move to the same points in space. I've seen people who only used Joint moves keep a robot working productively even with very bad Mastering. Of course, as soon as the robot was properly Mastered, all the moves taught against the bad Mastering had to be re-done.

  • Cool... thank you again for the detailed responses here, really extremely helpful and pragmatic. We're doing everything you folks suggested for the tool frame, the user frame and, eventually, the line tracking frame.

    The suit we got is a robo suit by Robo World and it's made out of black Hypalon. Gary from Robo World was great to work with. We applied beeswax on all of the suit's zippers and nylon slide surfaces on the shoulder joint prior to putting the suit on the robot. We also did a careful re-master of the robot with help from Fanuc support over the phone. We called them because it seemed like the robot's J2 and J3 were not cycling fully through their strokes. That's when I learned about the unique Fanuc relationship between J2 and J3... anyway, after 15-20 minutes of doing what the hardware support tech said, the J2/J3 stroke and, overall, the robot's repeatability, speed and joints all now perform as the robot-specific Fanuc manual states they should.

    I've made a TCP pointer as you folks suggested and am now going to set up the tool frame and a new user frame that aligns with our conveyor. We don't have the line tracking software or encoder here yet to create the tracking frames, but should have it in the spring. I'm starting to grasp the utility and value of the frame approach thanks to you guys. J
