Posts by HawkME

    The Z height must be set to the distance (in the Z direction) from the offset frame to the top surface of the part where the taught vision pattern is. The Z height setting is critical to the accuracy of your vision process.


    To determine the Z height, either measure it directly or use a pointer on the robot with an accurate TCP. If you use the pointer method, set the robot UF to the offset frame and the UT to the pointer TCP, then jog the pointer to the surface of your part and write down the Z value.


    A good indication that the Z height is wrong is if you move the part (without rotating it) and the robot does not go to the same position relative to the part.

    Go to Fanuc's website and open the data sheet for your robot. It will show the maximum allowable wrist moment and inertia load. Then you can calculate your payload's moment and inertia and compare.



    UI[8] is an external enable signal from the master controlling device to the robot. The enable signal needs to be on before the program starts and throughout program execution. In your case it sounds like the PLC is controlling the UOP signals, but you would need to confirm this.


    First, I would make sure you are not losing the connection between the PLC and the robot.


    Second, check to see if there is some logic in the PLC that is preventing UI[8] from turning on and staying on. When I use a PLC to control a robot, I keep this signal held on all of the time, but your integrator may have done it differently and require other conditions to be met for UI[8] to be on.

    I am assuming the laser scanner is hardwired to the robot fence stop circuit and is stopping the robot, is this correct?


    Big Frank is right, don't use pause. Pause stops the program but allows the current motion to finish. Instead I would do something like this:


    Wait (laser clear)
    Pulse (reset) 1sec
    Wait 2 sec
    Pulse (start) 0.5 sec
    Monitor (condition)
    End
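
    If it helps, here is a rough TP sketch of that sequence. The DI/DO numbers and the condition handler program name are placeholders for whatever your cell is actually wired and configured to use, and the MONITOR line assumes you have the condition monitor function:


    Code
    : !Wait for scanner to clear, then reset and restart ;
    : WAIT DI[1:Laser Clear]=ON ;
    : DO[2:Fault Reset]=PULSE,1.0sec ;
    : WAIT 2.00(sec) ;
    : DO[3:Cycle Start]=PULSE,0.5sec ;
    : MONITOR COND_CHK ;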



    Welcome to the world of robotics!


    Quote

    Intuition tells me that I should use as little points as possible with CNT100 to make the moves smooth and cut the angles to reach my destination faster. It turned out that I may be wrong on that one.


    Your intuition is correct. To keep cycle time low, start out with as few points as possible and use smooth CNT100 moves. Then, as needed, reduce the CNT value and add intermediate points to avoid hitting obstacles. The purpose of CNT is to move "continuously" through an intermediate point as fast as possible.
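
    For example (point numbers and speeds are arbitrary), the difference is just the termination type on the motion line; CNT100 rounds through the intermediate points at speed, while FINE forces a complete stop:


    Code
    : J P[1] 100% CNT100 ;
    : J P[2] 100% CNT100 ;
    : J P[3] 100% FINE ;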


    Quote

    What about coordinates? Almost all of the guys I asked told me that teaching the robot by moving it only by JOINT is the fastest since then only one axis turns. Which for me doesn't seem to be the complete truth since there are times that the robot moves faster if the point is taught in WORLD or TOOL.


    The coordinate system you use should be determined by the intended use of that position, not cycle time. It is inaccurate to say the coordinate system affects cycle time. Use Joint coordinates for permanent positions, such as a home position that is not relative to a specific user frame. Use Cartesian (user) coordinates for positions that are related to a user frame, such as a fixture. The time to travel between 2 identical positions (same starting and final joint angles on all axes) will not be affected by the coordinate system. If it is, then something else is causing the difference.


    You were told that moving 1 axis is fastest. OK, but what if the robot needs to move 2 axes to reach the part? Would you move only 1 axis at a time, or move them both at the same time to reach your destination? It is an irrelevant statement. Using joint coordinates does not mean only 1 axis will move; the robot will move any and all axes it needs to reach the destination, which could be 1 or all 6.


    Quote

    I'm also curious about point motion types. What's better? Linear or Joint? I was told that I should use Joint (excluding situations like gripping and the like) but once more I have my doubts. In some cases linear seems to cut the path pretty effectively in my opinion, but people still had their doubts about it.


    Once again, the motion type you use should depend on the needed function of the robot. To put it in simplified terms, Joint moves are for traveling and linear moves are for doing work. Take a simple pick & place example. You start at home, travel with a joint move to station A. Use a linear move to do the work of picking up the part and retracting safely away from the fixture. Then travel with a joint move to station B. Then use a linear move to do the work of placing the part and safely retracting away. Then travel with a joint move back to home.


    In general, a joint move will be the fastest way to "travel" from A to B. But usually when you are doing "work" the robot needs to move in a straight line with a linear move in order to perform the work properly.
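
    As a rough sketch of that pick & place pattern (all position names, speeds, and the gripper output number are placeholders, not from your cell):


    Code
    : J P[1:Home] 100% CNT100 ;
    : J P[2:Approach A] 100% CNT50 ;
    : L P[3:Pick A] 250mm/sec FINE ;
    : DO[1:Gripper]=ON ;
    : WAIT .50(sec) ;
    : L P[2:Approach A] 500mm/sec CNT50 ;
    : J P[4:Approach B] 100% CNT50 ;
    : L P[5:Place B] 250mm/sec FINE ;
    : DO[1:Gripper]=OFF ;
    : WAIT .50(sec) ;
    : L P[4:Approach B] 500mm/sec CNT50 ;
    : J P[1:Home] 100% CNT100 ;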


    Quote

    Speaking of points, is it better to teach a point A then a point B and let the robot move back, allowing it to calculate the optimal path by itself and only add points where there is absolutely no chance for the guy to get there without ruining everything around, hence making like three or four points in total with long movements and turns for example? Or is it more effective to teach the robot more points by moving it only by an inch in between each of them?


    Teach point A and point B and the robot will take the optimal path. If you need to add an intermediate point to avoid a collision, then add one as needed, but generally the fewer the better. Teaching points inch by inch is crazy, unless you need to for an extremely complicated path.


    Quote

    One more question regarding points in general. To achieve the highest speed is it always necessary to teach points with 100% for Joint and 2000 for linear? If not, then why?


    In general, yes, though if you are making a small move the robot will not get anywhere near top speed. For example, if you are doing L P[1] 2000mm/sec FINE and only traveling 1 inch total distance, the robot may only get to 100 mm/s before it has to start slowing down to stop at its point. It won't hurt anything to tell it to do 2000, but it will only do what is possible. BTW, some robots will do 4000 mm/s. Also, you can use the max speed command if you have the motion package. It looks like this: L P[1] max_speed. It will move as fast as it can while maintaining a straight line, which will probably be a bit faster than 2000 with your robot.
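
    Side by side on the same point (the FINE termination here is just for illustration), the two forms might look something like this:


    Code
    : L P[1] 2000mm/sec FINE ;
    : L P[1] max_speed FINE ;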


    I think the moral of the story is that each command for the robot is there for a specific reason. There is not one movement type that is better than the other, but some are better suited for certain functions and you need both. Use the movement type that is proper for the job you need to do.

    That is unfortunate, however you should be able to manually select the program in the Prog Select menu, which is probably the easiest way in your situation.


    There are many options to select a program with remote start, but they are assuming that you have a remote device to select and start the program such as PLC, HMI or even hard wired buttons/switches. The main premise behind remote selection is that you send an integer # to the robot to select a certain program. The integer can be created from a group of Boolean inputs such as hardwired DI's. This could even be done programmatically without any external device or inputs, for example, by setting a register value and using branching logic to call a program. No doubt this is much more elegant if you have a PLC or can manipulate strings.
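
    As a minimal sketch of the register-and-branching approach (the register, group input, and program names here are placeholders): the controlling device, or even an earlier line in the same program, writes a number into a register, and a SELECT statement calls the matching program.


    Code
    : R[1:Prog Select]=GI[1:Prog Number] ;
    : SELECT R[1:Prog Select]=1,CALL JOB_A ;
    :        =2,CALL JOB_B ;
    :        =3,CALL JOB_C ;
    :        ELSE,JMP LBL[99] ;
    : LBL[99:No Selection] ;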

    If you have the DCS manual, read it.


    If you have RoboGuide, it makes it much easier to set up the zones. You can import a CAD file of your work cell then create and drag the zone boundaries and test them, then export to your robot and test in the real world.


    The two main components of DCS are Joint Position Check (JPC) and Cartesian Position Check (CPC). JPC is basically a safety-rated axis limit where you set the min and max degrees. With CPC you can create box- or polygon-shaped zones: you create a zone and choose a safe side, inside or outside, then you choose what has to stay in or out of the zone. You can create a model of your EOAT using box, sphere, and cylinder elements, and use the built-in model of the robot arm.


    For example, on my last project I used a CPC box zone with the safe side in (diagonal in), sized to the inside of the cell fence plus a buffer distance, so the robot and EOAT must stay inside that box and cannot crash into the fence. Then I created a CPC box-shaped zone with the safe side out (diagonal out) around a conveyor. The EOAT is allowed to come close enough to the conveyor to touch the parts, but not to accidentally crash into the conveyor; that zone is just for machine safeguarding.


    You can also monitor robot speed and safe inputs.


    One thing to keep in mind when creating zones is that the amount of complexity you can have is limited in order to keep the processing time fast. The controller will automatically calculate this and warn you if you go over. JPC takes very little processing time, CPC sphere and cylinder elements take a small amount, and CPC box and polygon shapes take a larger amount. I had 4-5 box elements, several cylinder and sphere elements, and a couple of JPC limits, and used up around half of the available processing time. It probably won't be an issue, but if it is, try replacing box elements with spheres or cylinders, or try replacing CPC checks with a JPC check.

    Are you always starting the same program with the remote start? For remote start you need a method to select the program and then start it. The selection methods are PNS, RSR, Style, or Other. I always use Other, and then I use UOP signals to start.


    With the Other method you define 1 program that will always be called by remote start. In this example the program is named "MAIN" and I use UI[18] - (production start) to start the robot, NOT UI[6] - (start). Using UI[18] with this method guarantees that your "MAIN" program will always start from line 1, whereas UI[6] will start the program from whatever line the cursor is currently on.


    Program Select:
    Program Select Mode: Other
    >F3-Detail> $shell_wrk.$cust_name : MAIN
    Production Start Method: UOP


    System>Config:
    Enable UI Signals: True
    Start for Continue only: True (UI[6] becomes a resume button not a start button, must now use UI[18] to start)
    CSTOPI for Abort: True
    Abort all programs by CSTOPI: True
    PROD_START depend on PNSTROBE: False

    Check your UOP mapping to see if it is still valid and has the correct Rack/Slot assignment.


    Also look at System>Config>UOP Auto Assignment. I always set this to None; otherwise, if you have unmapped UOP signals it will automatically map them for you, which is good for testing but in my experience can cause issues if you have an "abnormal" UOP mapping. This may not be your issue, but it is something to look into. You can also access it through the system variables:


    $IO_AUTO_UOP=0 (disable auto UOP assignment)
    $IO_AUTO_CFG=0 (disable auto DI/DO assignment)

    I don't believe there is a way to "record" a path, so you will need to record points and use the correct motion commands available to achieve the path you want. You may be able to just record 2 points and use a motion command to move between them, or you may have to record some intermediate points, depending on the situation.


    It sounds like you are jogging in Cartesian coordinates and rotating the tool orientation about the User Frame Z axis. In this case, you will need to use a Linear move to keep the tool tip in place. But depending on how much you are rotating you may put the J6 axis into a different turn count. The turn count can be determined by viewing the configuration string on the position screen. It will look like "NUT000" or something similar. The last number is the turn count for J6.


    Jog to your starting position and check the config, then jog to the end point and check the config. Does your config string change? If the turn count changes then you will need to add the "Wjnt" motion modifier to the end of your motion statement. An "L" move by default will ignore the turn count and take the shortest path. The Wjnt (wrist joint) modifier will force it to use the turn counts.
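
    For example (speed and termination type are arbitrary), the modifier is just appended to the end of the motion statement:


    Code
    : L P[2] 500mm/sec FINE Wjnt ;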


    If this does not solve your issue, then please post a screen shot of the starting and ending position screens so I can understand what you are trying to do.

    Are you sure that it is going to the correct position when you manually move to PR[100] by pressing 'move to' on the PR screen? If so, what are the active UF & UT when you do that? Are those numbers the same as the ones you have set for UFRAME_NUM= ; and UTOOL_NUM= ;


    In the Home macro, is the problem that it goes to the wrong position, or is it getting to the correct position but taking a different path than you want?

    When you use any PR with Cartesian representation, you absolutely must define both the UF & UT prior to using it for a motion command. I always do this at the top of the program in which they are used, or if I am switching back and forth between frames, I will place it right before the motion command.


    Here are the commands to set the UF & UT.

    Code
    : UFRAME_NUM=0;
    : UTOOL_NUM=1;


    You will need to determine the proper UF & UT for that position.


    Also, if you place it before the other Cartesian PR motions it will change their positions as well. It will leave the active frame set as you defined it, so you need to look through the entire auto program sequence to see if it will cause any unintended consequences.
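
    One way to contain the change (the frame numbers and the motion line here are made up for illustration) is to set the frames right before the PR motion, then set them back to whatever the rest of the auto sequence expects:


    Code
    : UFRAME_NUM=0 ;
    : UTOOL_NUM=1 ;
    : L PR[100:Home] 500mm/sec FINE ;
    : UFRAME_NUM=2 ;
    : UTOOL_NUM=1 ;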

    When you back up the files, you will get everything, including servo and mastering variables. Instead of loading all the files, take a backup of your robot and then load just the needed files to the other robots. The files you need will depend on what you have done; you will have to determine this. Here are some files you will probably want, but there may be more.


    1. All TP programs (*.tp)
    2. I/O configuration (DIOCFGSV.io)
    3. Numeric registers (NUMREG.vr)
    4. Position registers (POSREG.vr)



    You can see a list of all the files with descriptions by going to the robot webserver and looking at Variable files and TP program files. You can also use FTP to send and receive files to all of your robots.