Posts by HawkME

    If you have the DCS manual, read it.

    If you have RoboGuide, it makes it much easier to set up the zones. You can import a CAD file of your work cell then create and drag the zone boundaries and test them, then export to your robot and test in the real world.

    The two main components of DCS are Joint Position Check (JPC) and Cartesian Position Check (CPC). JPC is essentially a safety-rated axis limit where you set the min and max degrees for a joint. With CPC you can create box- or polygon-shaped zones: you create a zone, choose a safe side (inside or outside), then choose what has to stay in or out of the zone. You can build a model of your EOAT from box, sphere, and cylinder elements, and use the built-in model of the robot arm.

    For example, on my last project I used a CPC box zone, safe side in (diagonal in), set at the inside of the cell fence plus a buffer distance, requiring the robot and EOAT to stay inside that box so they cannot crash into the fence. Then I created a CPC box zone, safe side out (diagonal out), around a conveyor. The EOAT is allowed to come close enough to the conveyor to touch the parts, but cannot accidentally crash into the conveyor itself; that zone is purely for machine safeguarding.

    You can also monitor robot speed and safe inputs.

    One thing to keep in mind when creating zones is that the complexity you can have is limited, in order to keep the safety processing time fast. The controller automatically calculates the load and warns you if you go over. JPC takes very little processing time, CPC sphere and cylinder elements take a small amount, and CPC box and polygon shapes take a larger amount. I had 4-5 box elements, several cylinder and sphere elements, and a couple of JPC limits, and used up around half of the available processing time. It probably won't be an issue, but if it is, try replacing box elements with spheres or cylinders, or replacing a CPC check with a JPC check.

    Are you always starting the same program with the remote start? For remote start you need a method to select the program and then start it. The selection methods are PNS, RSR, Style, or Other. I always use Other, then use UOP signals to start.

    With the Other method you define 1 program that will always be called by remote start. In this example the program is named "MAIN" and I use UI[18] - (production start) to start the robot, NOT UI[6] - (start). Using UI[18] with this method guarantees that your "MAIN" program will always start from line 1, whereas UI[6] will start the program from whatever line the cursor is currently on.

    Program Select:
    Program Select Mode: Other
    >F3-Detail> $shell_wrk.$cust_name : MAIN
    Production Start Method: UOP

    Enable UI Signals: True
    Start for Continue only: True (UI[6] becomes a resume button not a start button, must now use UI[18] to start)
    CSTOPI for Abort: True
    Abort all programs by CSTOPI: True
    PROD_START depend on PNSTROBE: False

    Check your UOP mapping to see if it is still valid and has the correct Rack/Slot assignment.

    Also look at System > Config > UOP Auto Assignment. I always set this to None; otherwise, if you have unmapped UOP signals it will automatically map them for you, which is handy for testing but in my experience can cause issues if you have an "abnormal" UOP mapping. This may not be your issue, but it is something to look into. You can also set it through the system variables:

    $IO_AUTO_UOP=0 (disable auto UOP assignment)
    $IO_AUTO_CFG=0 (disable auto DI/DO assignment)

    I don't believe there is a way to "record" a path, so you will need to record points and use the correct motion commands available to achieve the path you want. You may be able to just record 2 points and use a motion command to move between them, or you may have to record some intermediate points, depending on the situation.

    It sounds like you are jogging in Cartesian coordinates and rotating the tool orientation about the User Frame Z axis. In this case, you will need to use a Linear move to keep the tool tip in place. But depending on how much you are rotating you may put the J6 axis into a different turn count. The turn count can be determined by viewing the configuration string on the position screen. It will look like "NUT000" or something similar. The last number is the turn count for J6.

    Jog to your starting position and check the config, then jog to the end point and check the config. Does your config string change? If the turn count changes then you will need to add the "Wjnt" motion modifier to the end of your motion statement. An "L" move by default will ignore the turn count and take the shortest path. The Wjnt (wrist joint) modifier will force it to use the turn counts.
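    A minimal example of the Wjnt modifier on a linear move (the point number, speed, and termination type are placeholders for your application):

    :L P[2] 500mm/sec FINE Wjnt ;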

    If this does not solve your issue, then please post a screen shot of the starting and ending position screens so I can understand what you are trying to do.

    Are you sure that it is going to the correct position when you manually move to PR[100] by pressing 'Move To' on the PR screen? If so, what are the active UF & UT when you do that? Are those numbers the same as what you have set with UFRAME_NUM= and UTOOL_NUM= ?

    In the Home macro, is the problem that it goes to the wrong position, or is it reaching the correct position but taking a different path than you want?

    When you use any PR with Cartesian representation, you absolutely must define both the UF & UT prior to using it for a motion command. I always do this at the top of the program in which they are used, or if I am switching back and forth between frames, I will place it right before the motion command.

    Here are the commands to set the UF & UT.

    :  UFRAME_NUM=0;
    :  UTOOL_NUM=1;

    You will need to determine the proper UF & UT for that position.
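    A minimal example of setting the frames immediately before the motion (the PR number, frame numbers, and speed are placeholders for your application):

    :  UFRAME_NUM=0 ;
    :  UTOOL_NUM=1 ;
    :L PR[100] 500mm/sec FINE ;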

    Also, if you place it before the other Cartesian PR motions it will change their positions as well. It leaves the active frame set as you define it, so look through the entire auto program sequence to check for unintended consequences.

    When you back up the files, you will get everything, including servo and mastering variables. Instead of loading all the files, take a backup of your robot and then load only the needed files to the other robots. Which files you need will depend on what you have done, so you will have to determine this yourself. Here are some files you will probably want, but there may be more:

    1. All TP programs (*.tp)
    2. IO configuration
    3. Numeric registers (NUMREG.vr)
    4. Position registers (POSREG.vr)

    You can see a list of all the files with descriptions by going to the robot webserver and looking at Variable files and TP program files. You can also use FTP to send and receive files to all of your robots.

    Alright, when you can, please check if PR[100] is in joint or Cartesian representation.

    If it is in Cartesian rep, then the PR will go to different locations depending on what the active UFrame and UTool are set to. I do not see anywhere in the program where this is defined. Since you stated that it moved to the correct position manually, this would make sense as the cause of your issue.

    If your PR is set to Joint representation then it will not matter what the active UFrame or UTool is, it will always go the same location. In this case, there is a different issue.

    It looks like the robot takes alternate routes depending on different situations, but eventually ends up at PR[100]. Please find out the following:

    1. Can you post the positional data for PR[100]? I would like to know if it is in Joint or Cartesian representation.
    2. When you go to the Position Register Screen, highlight PR[100] and press "F2 move to", does it go to the correct location?
    3. When you run the Home macro, does the program get to line 41 where the move to PR[100] occurs, or is it getting hung up somewhere?

    Ok, I see what you are saying.

    Are you moving in Joint or Cartesian? What type of path is the TCP making?

    Since you are jogging with the J6 button, either the TCP should be stationary while the arm moves or joint 6 is the only axis moving.


    To see if a PR has Cartesian or Joint representation, just highlight the PR and press Position. If you see X, Y, Z, W, P, R it is Cartesian; if you see J1, J2, J3 ... it is in Joint.

    To convert the coordinate representation of a PR:
    1. Set the active UF & UT that you want the PR to be converted in (very important step)
    2. highlight the PR you want to convert
    3. press Position
    4. press [Repre]
    5. select either joint or Cartesian


    Also, when I simply jog the robot in User Frame by pressing Shift + J6, my tool tip moves from point A to point B.
    How do I write a line of code in TP such that the above happens? I just want to replace the above jogging motion in a program.

    Are you saying you don't know how to record a point?

    Some situations you want to only use a PR in a certain UF & UT, in which case you set the active frames, but in other situations you want them to be applied to any frame.

    For example, say you are creating a pick-and-place program and you always want to approach the part straight on from 10 mm away. You could use a PR as a Tool Offset of 10 mm in the Z direction, and then re-use that PR offset for all pick-and-place operations, in multiple programs, and with any UF or UT setting.

    Fanuc did it this way on purpose so that PR's can be re-used easily. Think of it as a global vs. local variable in computer programming terminology.
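    A minimal sketch of that approach-point idea; the point, speed, and PR number are placeholders, and PR[2] is assumed to hold (0, 0, -10, 0, 0, 0) so the approach point sits 10 mm back along tool Z:

    :  !PR[2] holds the 10 mm tool Z offset ;
    :L P[1] 250mm/sec FINE Tool_Offset, PR[2] ;
    :L P[1] 100mm/sec FINE ;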

    There is not an exact way to do what you are saying, but you do have some options.

    1. Convert your PR from Cartesian to Joint representation. It will no longer be relative to a user frame but will be locked to an exact position.
    2. Set a point equal to your PR in each program you want to use it. See code below:

    :  P[1]=PR[1] ;
    :  UFRAME_NUM=1 ;
    :  UTOOL_NUM=1 ;
    :J P[1] 100% FINE ;

    The benefit of option 2 is that your point will always stay relative to a specific user frame and you can use that PR in multiple programs, you just don't use it directly in a motion statement.

    There is a timing chart in the manual, but it does not have actual times, it just shows when signals need to overlap, etc. Depending on scan times and RPI setting in the PLC, the actual timings could vary from system to system. The timings I have are fairly conservative, you could shorten them if there was a need to, but you would need to test it on your specific system to make sure it still works consistently.

    I used a dynamically created UFRAME to pick parts out of a tray that was found using a vision process. The vision process application frame and offset frame must be set to 0 for this to work correctly. The sample code creates UF 7 dynamically and uses PR[9] as a temporary position register to manipulate the data. I set PR[9,4] and PR[9,5] to zero; these are the W and P rotations, which must be set in the program because a 2D vision process is only going to give you X, Y, Z, and R (Z is set as a fixed height in the 2D vision process). Instead of hard coding the W and P values like I did, you should get them from your existing User Frame.
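    A minimal sketch of the idea, assuming an iRVision process named 'TRAY_FIND', vision register VR[1], and a controller that supports copying the VR offset data into a PR (register and frame numbers are placeholders):

    :  VISION RUN_FIND 'TRAY_FIND' ;
    :  VISION GET_OFFSET 'TRAY_FIND' VR[1] JMP LBL[99] ;
    :  PR[9]=VR[1].OFFSET ;
    :  !Zero the W and P rotations ;
    :  PR[9,4]=0 ;
    :  PR[9,5]=0 ;
    :  UFRAME[7]=PR[9] ;
    :  UFRAME_NUM=7 ;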

    You have a couple options.
    1. Apply the offset to each point
    2. Set an offset Condition at the beginning of the subroutine. You still have to add an offset command on each point but it is defined in one place.
    3. Use the VOFFSET combined with your user frame to dynamically generate a new user frame. Then all of your points will be relative and not need an offset at all. This is a bit more complex, but can work very well if done right.
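    A minimal sketch of option 2, assuming the offset data is stored in PR[1] (points and speed are placeholders):

    :  OFFSET CONDITION PR[1] ;
    :L P[1] 500mm/sec FINE Offset ;
    :L P[2] 500mm/sec FINE Offset ;

    The OFFSET CONDITION instruction defines the offset once at the top of the routine; each motion still needs the Offset modifier, but the PR is referenced in a single place.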
