Posts by ps0f0r


    today I finally tried to run some tests and I have a little problem. In the options of the SKIP instruction there is no $MISC[]... choice. Do I have to turn it on somewhere?

    Thank you.


    In the options of the Skip instruction you select the Parameter option, highlight the input with the 3 dots, press ENTER and write the parameter $MISC[1].$HPD_TRQ[x] < or > y, where x is the axis you want to monitor and y is the amount of disturbance at which the skip condition will return true.
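    As a minimal sketch (the axis number 5 and the threshold 3.5 are hypothetical values, pick your own from the disturbance screen), the pair of instructions would look something like:

    SKIP CONDITION $MISC[1].$HPD_TRQ[5]>3.5
    L P[2] 10mm/sec FINE Skip,LBL[1]

    If the condition is never satisfied, the robot completes the move to P[2] and jumps to LBL[1].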

    What you are saying is to jog the robot to the coordinates of the tool frame or the user frame? If the coordinates relate to a position in space, i.e. a pin, then switch to joint mode and jog the robot to the tip of the 'point'. Then take the values from the joint coordinate system, calculate the differences, then jog the robot to the Vernier marks and re-master. But where would I input the differences that I calculated? I'm gonna send you a PM, maybe you could call me to explain in detail if you wouldn't mind :)

    Go to your frame screen, enter the frame you want and write down the Direct Entry coordinates. Now create a program and inside declare a point anywhere, doesn't matter, but make sure you use the correct UTOOL and User Frame 0 (UF0), and change its coords to the frame coords. Move there. The robot should now be close to an identifiable reference point, a point someone (the integrator or a programmer in the past) used as the frame reference point.

    Open the point's position data and change the representation to joint. You should now see the joint angles of each axis for this position. Write them down.

    Now switch your jog mode to JOINT and try to correct the position of each axis separately so that the robot tool is aligned and touching the reference point. You should be careful not to correct one axis too much; better spread the fault difference by applying small corrections to every axis.

    Now when you are done and you have aligned the robot with the frame reference point, declare another point in your program. You should now have 2 points: point 1 is the original point from the Frame Coords used as direct entry and point 2 is the modified version of it in order to touch the reference point.

    Write down the JOINT angles of each axis for point 2 and calculate the difference. You should come up with a minor degree difference, negative or positive, for some or all axes; the margin should not be large. If it is large, something has gone wrong.

    Now, put your robot axes to ZERO position and, using the results from your previous calculation, apply them to each axis. Again, you should check that after applying the correction to each axis the marks are not way off from where they were supposed to be; it should be ±2, max 3 degrees. If a Vernier mark is visually way off then something has gone wrong. When you are done applying the corrections, master your robot with zero mastering, Calibrate, and you are done!

    Repeat the procedure by moving to point 1 (the frame ZERO point, not the corrected one) and verify that the robot now correctly aligns with the frame reference point by using the direct entry values.
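    A worked example with made-up numbers, to make the calculation concrete:

    Point 1 (frame coords, JOINT representation):  J1= 12.00  J2= 35.50  J3= -10.20
    Point 2 (corrected, JOINT representation):     J1= 12.35  J2= 35.10  J3= -10.20
    Difference (point 2 - point 1):                J1= +0.35  J2= -0.40  J3=  0.00

    So at zero position you would jog J1 by +0.35 degrees and J2 by -0.40 degrees away from the marks, and then zero-master.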

    Things that can go wrong:
    Identifying and using a reference point that is not a good reference point (worn out, or with excess tolerance due to wear and tear or age).
    The frame may (and most probably) have been set when the system was brand new, before any tolerances were introduced to the system's parts through normal aging and wear. The same goes for the mechanical parts of the robot. Machines age, and as they age new tolerances are introduced, and we compensate in our programs to correct any faults due to aging.

    If you run into any of these possibilities, remaster to zero as before, using the marks and your eyes, and just reteach the user frame. The tool frame does not need to be taught; you can use the direct entry values.

    ps0f0r The pick and placement position is off after the remaster. We did a Zero Master. But since we did the remaster we have to reteach the user and tool frames because those values are entered directly, they aren't taught positions.

    Yes, I guessed so. For the sake of it, verify mastering by putting the robot on zero. Also keep in mind it is not a good idea to use Direct Entry for User Frames. User Frames should always be taught, imo.

    There is also another trick you can do; it is a high-risk high-reward move and can produce great results, but it is a bit more advanced. Using the same tool frame as before and without teaching it, I would try to hit the zero (reference) position of the frame. If the directly entered frame points to a solid reference position in space (a pin, an edge point etc.), move to that position and then correct the JOINT positions of each axis separately until the robot is correctly aligned with the frame reference point. Write down the difference. Let's say you had to correct +1.3 degrees on J1, -0.7 on J2 and +1.3 on J4 in order to hit the reference position of your frame. Put your robot to zero. Apply the corrections to the specified axes and re-master them to zero.

    Are the frames taught or directly entered?

    By repeatable do you mean that each cycle the robot position changes or that the placement position seems off after the recent remaster?

    How did you master the robot? Zero mastering? Quick master?

    I bet you just mean that your positions are a bit off compared to before the mastering. Unfortunately we cannot hit the Zero marks exactly by eye. Every re-mastering using the Vernier marks will produce slightly different positions.

    Put your robot to Zero position and verify that all Vernier marks are aligned in every axis and then reteach your User and Tool frames. This is about as close as you can get to your old configuration when using Zero mastering.

    If by not repeatable you mean that each cycle the same position is shifted then this is a different story that suggests a mechanical problem on the robot or complex logic used inside the program that is not accounted for. (a change in a PR, a calculated PR, a wrong offset application etc etc)

    Roboguide has different saved points from different dates. Did you check those saved points for the possibility of reverting your roboguide cell back in time?

    The photos suggest you use an R-30iB+ controller; if so, have you set up the Auto Backup feature on the real robot? How many versions have you declared? If you go to FILE > Auto Backup you can check the maximum number of versions, with dates, that the controller keeps before overwriting the oldest backup. Maybe you are able to restore the TP program you modified using a previous backup.

    If you don't have any of the above options then I guess this is a good lesson to start using them, especially the auto-backup feature. I have it set to 3 loadable versions and the backup starts automatically every day at 09:00; this way I can revert any bad changes up to 3 days old.

    Option 3: inside Roboguide, from the project explorer expand robot > Tooling > double click the EOAT you use and find the "Calibration" tab. Have you calibrated your UTOOL to match the Roboguide tooling? Do a calibration by following the steps mentioned in there.

    You can get the manual from FANUC CRC portal. This is the one you are looking for:

    FANUC Robotics SYSTEM R-J3iB Controller DeviceNet Setup and Operations Manual


    Chances are you are doing something wrong. You do not need to use Frame Offset on the robot that receives the copy of the program, only on the originator, and only if the positional data are not already recorded with respect to an already declared User Frame.
    Select case:
    1) The original program has positions recorded in a User Frame. Copy the program to the desired robot, making sure that the receiving robot has the same User Frame taught in the same position in the list. If the program is using UF1, UF1 being the center of something (a table, a fixture, a shaft), then UF1_originator = UF1_receiver. Not coordinate-wise but space-wise: do not copy the values of one UF to the other, but teach it using the same reference position.
    2) The original program has positions not recorded in a User Frame. Make (teach) a User Frame on the originator robot. Use the coordinate shift function under UTILITIES > Frame Offset. Convert the UF of the program to the newly created UF.
    All these actions are done on the originator robot.
    Now go to the receiver robot. Teach the same User Frame in the same position in the list, copy your program, done.

    The first thing I noticed from your attached photo is that you have not declared the label segment you want to jump into. In fact you have declared no LBL sections at all, and I strongly believe this is the reason why you can't move to and modify that segment of the command. Try declaring a LBL[5] further down your program with a dummy instruction like R[10]=0. Now the TP should let you navigate to the LBL portion of your skip command.

    You can always install a trial version of roboguide, create an exact copy of your existing robot, and make some tests in there like:

    Create a routine-like program with the skip condition and the high-speed skip instruction using the virtual teach pendant, what happens?

    Try to upload that program to your controller, what happens? Are you able to modify the label?

    Write the program in roboguide’s text editor then compile and load to the controller. What happens?

    There is no such thing as a "Dynamic User Frame" as far as FANUC terminology goes. The label "dynamic" is just a label the integrator used to give the user frame a meaning. There are many ways to make a User Frame "dynamic" (= a user frame that changes based on parameters and conditions defined by the programmer) but all this is dependent on the application. You should talk to the integrator if you wish to know why he labeled this frame as dynamic. Chances are the frame is used inside a program that manipulates it (the frame) based on conditions met, the fixture used or the workpiece to be welded, and there should also be an automated procedure (a TP program or maybe a Karel program) that modifies it based on parameters or a search function.

    All these are wild guesses. You should talk to your integrator anyway. As far as I can figure out, there is a lot of info about your robots left unclarified by them.

    It worked well when I taught the same UF. I have noticed that the integrators left a few robots with UFs that have almost the same coordinates; that means I don't have to teach an extra UF, right? I mean, is it important that the UF is related to the item?

    Yes. As I already stated in my previous post, it is very important that the User Frames are taught, because teaching accounts for small tolerances of mechanical part assemblies. Almost is not the same. The "almost" part is exactly the difference in tolerances I explained before.

    There is no need to teach any extra UFs. But it is very important that you verify the UF of the robot(s) that will receive a copy of the program you wish to run.

    Thx for the detailed explanation. I have 10 robots here and each one is doing different models of a crane arm. The thing is, the frames were taught by the integrators, so I'll have to copy the point coordinates in order to teach it to another robot 🤔 or maybe make a new frame

    No you don't. Take the robot with the taught frame for a spin and get the robot to hit the FRAME's origin. To do that, go to SETUP > FRAMES, press F3 OTHER, choose User Frames, go inside the frame you are interested in and check the positions used by the integrator to teach the frame by using the MOVE TO command. By doing so you get a glance of which reference point the integrator used on the fixture. Another way would be to create a program, select the user frame, create a point anywhere and change its coords to zero. Move to that point and you've got yourself a USER FRAME origin. USE SMALL SPEED OVERRIDES !!!!

    So what I did is I copied the UF and UT from the source robot and transferred them to the new one via direct entry. Then I used offset UF/TF to transfer the points into the new UF and TF. It worked, but still the points are quite far from where they should be. So something is missing...

    What's missing is that you used direct entry of the User Frame without actually teaching the frame to the robot. What a user frame basically does is translate a point in space in relation to the origin of the User Frame. I, for example, have an application that uses grinding wheels to grind different kinds of pieces with the robots. I have 10 robots, all in their own cells, but many times I want the flexibility of moving the grinding process of a workpiece from one robot to another.

    This is where User Frames come into play. By using a global tool (a tool that I can install on all my robots) I teach every robot a user frame that is basically the center of the shaft the grinding wheel is mounted on. Now, when using this User Frame, all of the positions of the grinding program are saved as a translation of the distance the robot tool is away from the center of the grinding wheel shaft (the user frame).

    The key here is TEACHING the User Frame, not copying it. Even though all cells are identical, there are factors like cell assembly tolerances, robot assembly tolerances and grinding wheel position tolerances that will not be accounted for unless I actually TEACH every robot the center of its working shaft. You do not need to use any offset utility when copying a program this way; you just need to make sure that the UFRAME and the UTOOL used in the original program are the same on both robots. To clarify: if the original program was taught on a robot using UFRAME 1 (= the origin of the table the workpiece is put on) with UTOOL 1 being a welding torch, then you must make sure that on the robot that is going to receive the copy of the program, UFRAME 1 is the origin of the table the workpiece is going to be put on, UTOOL 1 is its torch, and both of them are TAUGHT. You must also make sure that the reference used in teaching the User Frames is the same on both fixtures inside the cells. Use a DIN pin located at the same position on both fixtures.

    Also worth adding here: even if the original program to be copied did not take advantage of any user frames, you can still create (teach) a user frame and convert the positions using the UFRAME offset utility, or by touching up each position in the program with the new user frame selected. The procedure is: choose the user frame the points are saved with, visit the point, change User Frame with SHIFT+COORD, press TOUCHUP, and on the popup that shows up change the old user frame number the position is stored with to the new user frame you created, press ENTER, visit the next point, and so on and so forth. Believe it or not, I use the second method because I don't really trust automated shift functions, for a reason I don't know yet. 8)


    I am using a FANUC robot, model R-2000i 165F.

    I want to lock the override speed at 95% in auto mode.

    1) Go to SYSTEM > Config. There you can set the option "Allow chg.ovrd. in AUTO mode" to FALSE.
    2) Go to SYSTEM > Config. Another option there is "Signal to set in AUTO mode", which allows you to activate a DO signal when the robot starts in AUTO mode. Using this signal, you can create a background logic program that sets the $GENOV_ENB system variable to 0 (false) when the DO you defined is on.
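    A minimal BG Logic sketch of option 2, assuming DO[10] is the signal you configured for AUTO mode (the signal number is hypothetical):

    IF (DO[10]=ON) THEN
    $GENOV_ENB=0
    ELSE
    $GENOV_ENB=1
    ENDIF

    With $GENOV_ENB at 0, the override keys on the pendant stop changing the general override, so it stays wherever you last set it.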

    The more professional way to do this would be to use the Override Select function from MENU > SETUP. There you define a pair of DIs and, based on their state, the system practically disables the override buttons and sets the override speed defined in the setup. Then you manipulate the override signals from your PLC (e.g. set the DIs to true when pressing the "CYCLE START" command).

    Press MENU > SYSTEM > Variables. There you will find a list of all the system variables, sorted alphabetically. Find $MISC_MSTR, press ENTER, and inside it set $HPD_ENB to TRUE.

    I doubt high-speed skip will help you in this kind of application. Is the workpiece flat or contoured? Does the suction path involve tool orientation change?

    You can try this strategy:
    First set $MISC_MSTR.$HPD_ENB to TRUE and do a cold start.

    Slowly move the robot TCP towards your workpiece, monitoring the axis disturbance screen at the same time; split your screen in 2 windows. Continue moving towards your workpiece until you identify which axis has the highest disturbance. Let's say it's axis 5 and the disturbance just before a collision detection is 3.5.

    Then you can make a routine like this:

    J P[34] 100% CNT10 (clearance before piece)
    L P[8] 10mm/sec CNT100 (approach)
    L P[30] 1mm/sec FINE Skip,LBL[5],PR[18:workpiece touch]=LPOS
    PR[18:workpiece touch,3]=PR[18:workpiece touch,3]+5
    L PR[18:workpiece touch] 5mm/sec FINE

    *put here the logic that handles the case where the robot does not find a workpiece during the search routine*

    *cleaning path*

    If your nozzle normally touches the workpiece at, for example, Z=30, modify the Z coord of P[30] to 10; the robot will try to slowly reach 10, and it will stop when the disturbance on the axis you defined (in the example, axis 5) reaches the skip condition limit, saving that position to PR[18]. Then it will increase the Z coord of PR[18] by 5 mm and move to that position. From there, you can create a cleaning path in multiple ways:

    You could create your path as an offset to PR[18] or by using incremental moves.

    You can use PR[18] as a frame and program your cleaning path in that frame. Every cycle the robot will recalculate the User Frame and modify the cleaning path.
    To define the mentioned PR as a working user frame you can follow this example:
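    A minimal sketch of the idea (the frame number 9 and point numbers are hypothetical, use whatever is free on your system):

    UFRAME[9]=PR[18:workpiece touch]
    UFRAME_NUM=9
    L P[40] 50mm/sec FINE (first point of the cleaning path, taught in UFRAME 9)

    Since the search refreshes PR[18] every cycle, the frame, and therefore the whole path, follows the workpiece height.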

    You need to give some more details concerning the application. What type of application is it? Pick and place? Machine tending? Arc welding? What is the robot model and which controller?

    There are 2 functions that can help you with what you are trying to achieve. One is touch sensing and the other one is touch skip function. Touch skip usually comes together with high sensitivity collision detection function.

    If you have neither of these, then you can use the High-speed skip instruction, which is standard on all robots, with a skip condition that monitors the disturbance of the axis (or axes, depending on the kind of move). But this is not really sensitive, which is why you have to give more info on the kind of application you are working with.

    You can save some hassle and look for a global vision system like pickIt which is very cost effective and a plug and play solution.

    From where is "home" pressed? Do you use a PLC? If you are using a PLC then you first need to go to SYSTEM > Config and set "Enable UI signals", "CSTOPI for abort" and "Abort all programs by CSTOPI" all to TRUE.

    If you are using a PLC then you should map your UI signals (Cycle stop is the one we are interested in) to a set of DIs. After completing those steps there are a ton of ways to send the robot home. You can create an R_HOME macro, for example, and assign it to a DI bit. The PLC can raise both the CSTOPI signal from the mapped DI and the HOME signal.

    If you don't use a PLC you can map your UI signals to flags using rack 34, I think. Then you can use Background Logic to raise the flag associated with CSTOPI and then another flag to activate the R_HOME macro. Be advised that mapping your UI signals to flags compromises safety and is not considered good practice; it is not advised at all and it adds additional thought to the design of the system.

    You cannot use BG_LOGIC to call a motion program.

    Ok, let's see.

    1) Press MENU > SETUP and choose the Prog Select option. From here you can customize several things concerning how the robot will start and what to check during production (AUTO) mode. The first option is your program select mode: press F4 CHOICE, choose the OTHER option, press ENTER, then press F3 DETAIL and manually modify the variable $shell_wrk.$cust_name by pressing CHOICE and choosing PROG_1 TP from the list. You can also configure several safety checks before a program starts, such as whether the robot is at home position before start.

    2) Yes you can. Create your desired function by using Argument Registers for your parameters. Here is a sample function that initializes a PR for me (notice the indirect addressing I use to assign the parameter):
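    The screenshot did not survive here, so here is a minimal sketch of what such a function could look like (the register numbers and the body are my assumption, not the original): AR[1] carries the PR index passed by the caller, and the indirect addressing PR[R[50],i] clears each component of that PR.

    /PROG PR_INIT
    R[50]=AR[1]
    PR[R[50],1]=0
    PR[R[50],2]=0
    PR[R[50],3]=0
    PR[R[50],4]=0
    PR[R[50],5]=0
    PR[R[50],6]=0
    /END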

    CALL PR_INIT(55)

    3) This is called standard I/O assignment and it is performed automatically. Depending on the communication protocol you have selected for your project (Profibus, EthernetIP, ModbusTCP etc.) and provided you have set up the connection properly, you map the logical signals (DI/DO) to the physical ones (the signals from the connected device) based on the rack and slot number. If for example you have the EthernetIP option and you have already configured the connection, you can go to MENU > I/O > Digital, press F2 CONFIG and map the logical signals to their physical locations. Here the rack indicates the kind of I/O module installed on your system and the slot indicates the number of the I/O module within the rack. For EthernetIP the rack is always 89; for ModbusTCP, for example, the rack number is 96, and the slot is 1 because I do not use additional I/O modules. So DI[1-80] is located on rack 89 (EthernetIP), slot 1, starting from physical address 1; DI[81-272] again on rack 89, slot 1, starting from 81, etc.

    4) For all the above options you need to have the EthernetIP optional function on the robot. EthernetIP includes FTP, Advanced Internet Connectivity and Customization, and Socket Messaging. If you do have it, you need to configure host comm properties through MENU > SETUP > Host Comm. There you configure the robot's TCP/IP params (be careful to configure the port you actually use): IP, subnet, router, pretty easy stuff. After that, you can load your TP programs onto your robot (provided it is on the same LAN your PC or laptop is connected to) using Roboguide or the robot's FTP. You could even use external FTPs to save or back up your robot programs daily, or access and download files and programs from other robots' FTPs, provided they are connected to the same network of course.

    EDIT: you also need the ASCII Upload option too, as HawkMe says. I didn't mention it as ASCII Upload comes as standard in Europe. But then again you also need Roboguide too, who writes .TP programs in Notepad nowadays? :D

    Concerning the palletizing program, you could automate the function:

    This way you only need to teach the 3 positions relating to the approach and deposit of the first piece. The robot will calculate everything else.
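    As a rough sketch of the register math involved (all register and PR numbers are hypothetical): with the first deposit position taught in PR[20] and the pallet pitches stored in registers, every other position is computed from the row and column counters.

    !R[1]=column index, R[2]=row index
    !R[3]=column pitch in mm, R[4]=row pitch in mm
    R[5]=R[1]*R[3]
    R[6]=R[2]*R[4]
    PR[21]=PR[20]
    PR[21,1]=PR[21,1]+R[5]
    PR[21,2]=PR[21,2]+R[6]
    L PR[21] 500mm/sec FINE

    Increment the counters after each deposit and reset them when a layer is full.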