Posts by demarkg

    I just got a call from one of our plant technicians, and they said they have a robot that has lost its core software. Apparently, they were making some hardware repairs that may have caused this, but I am not sure.

    I do have a copy of the original core software on the original USB drive sent with the robot.

    What are the steps to restore this core software? After the core software is restored, what should I expect next? Will I also need an image backup and/or an "All of above" file?


    Quick Mastering Question:

    Let’s say I have just set my Quick Mastering Reference Position with the robot joint angles all at their typical zero positions. Then later, if I get a BZAL alarm, I may wish to perform a Quick Master. When I perform the Quick Master, do I need to have the robot in the Reference Position? If that is the case, then how is that an advantage over Zero Mastering?

    Also, I believe that the Reference Position can be set to a convenient robot position other than zero. In this case, does the Quick Master need to be performed at the non-zero Reference Position?


    Yes, that makes sense to me. --> if you don't specify, then it will use the currently active user frame.

    That is what I observed when I tested it in Roboguide. Thanks for confirming.

    I also noticed that initially, the resulting position was not what I expected (the sum of x,y,z,w,p,r elements). The reason was that the system variable $OFFSET_CART was set to True. Things started working as I expected after changing that value to False.

    from Fanuc manual:

    If $OFFSET_CART is TRUE, offsets for Cartesian positions are treated as frames and used to
    pre-multiply positions. If this is FALSE, offsets for XYZWPR positions are added field by field.
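    To make sure I understand that paragraph, here is a quick numeric sketch of the two behaviors (just the math as I read the manual text, not Fanuc code; I am assuming the usual convention that the R angle is a rotation about Z, and I simplified to an offset that is a pure R rotation):

    ```python
    from math import radians, sin, cos

    def rot_z(deg):
        """Rotation matrix about Z (the 'R' component in Fanuc x,y,z,w,p,r)."""
        a = radians(deg)
        return [[cos(a), -sin(a), 0.0],
                [sin(a),  cos(a), 0.0],
                [0.0,     0.0,    1.0]]

    def matvec(R, v):
        return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

    pos    = [100.0, 0.0, 0.0, 0.0, 0.0, 0.0]   # taught point: x=100, no rotation
    offset = [0.0,   0.0, 0.0, 0.0, 0.0, 90.0]  # offset PR: pure 90-deg R rotation

    # $OFFSET_CART = FALSE: offset added field by field.
    added = [a + b for a, b in zip(pos, offset)]
    # x stays 100; only the R angle changes to 90.

    # $OFFSET_CART = TRUE: offset treated as a frame and pre-multiplied,
    # so the taught translation gets rotated by the offset's rotation:
    premul_xyz = matvec(rot_z(offset[5]), pos[:3])
    # translation becomes roughly (0, 100, 0) instead of (100, 0, 0).

    print(added, [round(c, 3) for c in premul_xyz])
    ```

    That second result matches what I saw: with TRUE the resulting position is not the element-wise sum.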


    I have a question regarding offsets, specifically the offset condition instruction. I prefer to use the offset condition instruction when I want to apply the same offset to multiple motion instructions. However, when I was reviewing the Fanuc manual, it mentioned that you can specify a u-frame as an option at the end of the instruction (see attached snip). I do not understand how specifying the u-frame would work or how it would change the related offset motions. It was also mentioned that if you do not specify a user frame, the world frame is assumed. Can anyone offer any insight to clarify this option a bit? Thanks

    FYI: I have $OFFSET_CART=FALSE

    Here is a strange issue I am having with Roboguide. My mouse cursor vanishes (turns into a hard-to-see cross-hair) when traveling across the background of the work cell. The cursor appears again when traveling over the robot, fixtures, or parts. Any ideas on how to keep the cursor visible at all times?

    - Also, does anyone know why the virtual robot does not move smoothly when I run the program in continuous test mode? It seems like I am in step mode even though I am not.


    Does anyone know how to turn off the display of the program motion lines in Roboguide? The line seems to be showing the path between the TCP and the next point. I would like to be able to hide that line. Thanks

    We never did figure out the root cause. I ended up getting a new laptop for now. Roboguide is working fine on the new laptop. I am guessing there was some conflict with the other controls software I had installed.

    Sadly, I have reinstalled several times. The downloaded files should be OK, as those same files work for my colleague. My IT guy is at his wits' end and is ready to get me a new laptop.

    I recently purchased a license of Roboguide. I have installed the software as admin. I can open the example work cell and manipulate it with no issues. However, when I create a new work cell, the software just hangs on the creation step. When I force it to close, I get the error message shown below. This seems to be an issue with my laptop, as my colleague can install and run the software with no issues even though he has the same model laptop. Any ideas to help me resolve the issue? Thanks

    Looks like there are several different responses to this question, which is why I asked it in the first place. When I get back to my shop I am going to try to confirm the answer by trial and error. I feel like the rotations should be about the reference tool frame; this would allow the tool to rotate about the TCP without changing the X, Y, Z coordinates. An interesting and basic question for sure.

    So (firatgzl) are you saying they are always measured with respect to the world frame regardless of the specific user frame the point was taught in? Just trying to reconcile your response with (SkyeFire). Thanks

    I am trying to gain a better understanding of the orientation parameters. Rotation about x (Rx), Rotation about y (Ry), and Rotation about z (Rz). Are these rotation angles measured with respect to the current tool frame? Or are they measured with respect to the world/user frame? Thanks

    I am curious if there is a more standard (common) way to implement safety devices within a robot cell. For example, if I have an E-stop, door gate interlock, and a light curtain, I could wire them directly to the terminal block (TBOP13) on the robot controller circuit board. I have the following controller: R-30iB Plus, cabinet A. However, what if I would like to use a safety controller such as the Banner XS26-2DE? In that case, I would prefer to use the master safety output relays from the safety controller to tell the robot safe or not safe. So my initial question is an attempt to understand if it is typical (or possible) to use a safety controller to govern the robot controller. Thanks

    I have a basic question about jog frames and their intended purpose. I have created both tool and user frames in the past and I understand how they are applied and why they are valuable. I am not clear on when you would create a jog frame. For example, if I just created UF(8) then I can simply set the current user frame to UF(8) and the robot will jog aligned with that user frame. So when would I need to create a jog frame?


    So to be clear. If I move my user frame (UF3), will the PRs taught in that UF3 move with the frame, or do they stay relative to the world frame?

    Just want to check my understanding of position registers (PR). So I am thinking of PRs as a sort of global position variable that is not specific to any single TP program. So I can use a PR in one or more TP programs, unlike a standard position that is local to the TP program it was created in.

    Hope that is correct so far.

    The part I am less sure of is in regard to how a PR is used within a custom user frame (UF). So let's say that I just created UF(3) and I then set the active frame to UF(3) and then I create a few new PRs. Will the PRs be tied to the UF(3) or are PRs always tied to the default world coordinate system? Typically when you teach a basic point you need to pay attention to the active user frame, is this also true for PRs?
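    While waiting on an answer, here is the general frame math behind my question (this is just geometry, not a claim about what the controller actually stores; I simplified to a translation-only frame, while a real UF also has rotation):

    ```python
    def to_world(uf_origin, p_in_uf):
        """World position of a point recorded relative to a user frame.
        Simplified to a translation-only frame for illustration."""
        return [o + p for o, p in zip(uf_origin, p_in_uf)]

    p_in_uf3 = [10.0, 0.0, 0.0]      # point as recorded relative to UF(3)

    uf3_before = [500.0, 0.0, 0.0]   # original UF(3) origin in world
    uf3_after  = [500.0, 200.0, 0.0] # UF(3) shifted +200 mm in Y

    # If the stored value is frame-relative, the world position follows the frame:
    print(to_world(uf3_before, p_in_uf3))  # [510.0, 0.0, 0.0]
    print(to_world(uf3_after,  p_in_uf3))  # [510.0, 200.0, 0.0]
    ```

    So my question boils down to: is a PR stored like `p_in_uf3` (frame-relative) or like the world-frame result?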

    Thanks very much

    I know this topic has been discussed in other threads. However, I am still not 100% clear.

    I have attached an image showing my tooling setup, including the calculations for inertia (I am using Creo in this case).

    Note: in my image the Robot TCP is the center of the orange circle plate (labeled x, y, z in the image).

    I want to make sure I am using the correct values for inertia as highlighted in the red boxes.

    I understand that I need to convert the units from kg*mm^2 to kg*cm^2 (divide by 100). Also, I apparently need to divide everything by 980 (gravity in cm/s^2) to convert from kg*cm^2 to kgf*cm*s^2.

    So my results would be:

    Ix: 0.909 kgf*cm*s^2

    Iy: 0.953 kgf*cm*s^2

    Iz: 0.084 kgf*cm*s^2

    What do you think? Does this look close?
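    Here is the arithmetic I am using, as a sanity check (the 98000 input below is just a made-up round number for illustration, not one of my actual Creo values):

    ```python
    G = 980.0  # gravitational acceleration, cm/s^2

    def creo_to_fanuc_inertia(i_kg_mm2):
        """Convert a Creo mass moment of inertia in kg*mm^2 to the
        kgf*cm*s^2 value the payload screen expects:
        /100 for mm^2 -> cm^2, then /980 for kg*cm^2 -> kgf*cm*s^2."""
        i_kg_cm2 = i_kg_mm2 / 100.0
        return i_kg_cm2 / G

    # made-up round number for illustration:
    print(creo_to_fanuc_inertia(98000.0))  # 1.0
    ```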