Posts by TomFoolious

    Normally I do all my programming in Roboguide, then dump the programs in and make small handshake changes, I/O changes, whatever, on the pendant. I can't stress this enough: memorizing the shortcuts is the key to efficient programming. For example, Menu, 6 (using the TP numpad) gets you to Setup... Menu, 0, 6 gets you to System... Function, 1 aborts all, Function, 0, 6 unsimulates all I/O... When editing a program, F5, 3, F2 will start your selection tool, then Item ### jumps to where you want to select to (if it's far away and you know the line number), then F2 again to Copy... F5, 1, #, Enter will insert lines.


    I think you get what I mean... memorizing keystrokes has made me a much more efficient programmer. Also, people get a false sense of "this guy knows what he's friggin' doin'," which is funny.


    Oh, and the same goes for Roboguide as well. When you have the Virtual TP up, right-click on one of the lower grey portions alongside the keypad section and choose "Show Key Map"; this will show you which PC keyboard keys map to which Virtual TP hardkeys/softkeys. Some are not shown, like Alt+5 brings up the Position screen, Alt+6 brings up I/O, and Alt+7 brings up the Status screen.


    Once again: Learning keystrokes makes programming a FANUC robot much more efficient and enjoyable(?).

    If I had to guess, the keys should probably be the same as Roboguide.


    The TP runs its own Windows OS. The membrane keys just send standard QWERTY key presses to Windows, which are then remapped for use within the software responsible for displaying the TP interface.

    Ah yeah, I forgot about that. A tech I was working with once showed me the Windows screen. It's probably the Logitech keyboard then... Any idea what version of Windows they are running? I wonder if I could look at keyboards compatible with that version for better results?

    Three things, in this order, are needed to start a PNS routine:

    1.) 8-bit pattern defining which PNS job you want to run

    2.) PNS Start

    3.) Production Start


    So yes, if you tie those UI signals to inputs through configuration, then you should be able to trigger the robot to run the PNS.


    And as Skooter said, UI[6] is for restarting the robot from a PAUSED state. I believe it is also only active when the robot is in REMOTE.

    You're very limited in interacting with the Virtual Teach Pendant (VTP) in Roboguide. No, there is no way of changing whether keys are maintained or momentarily pressed. You can use a keyboard with Roboguide's Teach Pendant, though. Along the very top of the VTP there is a button with a tiny keyboard inside it - press that and it will remove the yellow outline around the VTP screen and allow you to fully use a keyboard. If you need to see which buttons do what on the VTP, right-click on the VTP near the TP Enable toggle switch and choose "Show Key Map". This will show you MOST of the keyboard keys that are mapped to the VTP buttons. Now I did say MOST; the remaining buttons, like POSN, I/O, STATUS, etc., can be pressed using Alt + a number, which you'll see if you hover over them.


    One annoying thing about using a keyboard with the VTP, though: sometimes when clicking around or using F8 for the menu, the VTP will drop keyboard function and you need to re-enable it by clicking the tiny keyboard button. If you ever see a yellow outline around the VTP screen, you'll only have partial keyboard support until you re-enable it.


    Sorry this doesn't answer your question entirely, but hopefully it helps you and some others...

    Hey all,


    I plugged in and started using my wireless keyboard, which is a Logitech K360. It works fine, but I'm looking for recommendations from anyone who is using one regularly and is happy with it. I do a bit of copying from Roboguide to the pendant (rather than loading .SV files and such, since I'm usually working without an actual robot backup). My keyboard works, but weirdly: mainly, I cannot use the numpad numbers, but the numpad symbols are OK to use. Also, I'm not sure if the keyboard is supposed to work on the physical TP the way it does on the Virtual TP in Roboguide; again, mainly that when I press ESC, it triggers the PREV button on the physical TP. It makes it kind of silly to use a keyboard if you still need to press some of the buttons on the TP. F1-F12 work as they do in Roboguide, which is why I'm confused whether this is just how a keyboard works with the physical TP or if it's a compatibility issue between my keyboard and the physical TP.


    Thanks for any help anyone can/may provide!

    TomFoolious

    Could it be due to the natural curvature of the welding wire compared to a nominal TCP?

    Nah, I have a contact tip from each manufacturer/torch I've come in contact with (great pun, eh?) that has a pointed dowel pin placed into it. I don't rely on weld wire for TCPs anymore - I always use the "teach tip" I've made.


    Thanks for the suggestion though. The project is complete and awaiting shipping (whenever COVID goes away at the customer's facility I guess). Having an accurate TCP was all I needed.

    Figured it out... Under Run Configuration I had it set to run a user-selected job... When I changed that to No UOP Control - which means an external device is issuing what the UOP will do - the first three UOP In signals went off. Now I think I can continue debugging.


    I also managed to get FANUC Ladder III set up with Roboguide... and Roboguide going through KepServer to talk to the FactoryTalk HMI software. Friggin' awesome :) Now I can do all the offline debugging I need for my next project!

    That's fine; I'm used to seeing our PLC guys use the signals/methods you described anyway... Just from a knowledge/learning aspect, I didn't understand why they are consistently on. From the little I know about PLC/ladder logic, if power flows through the rung to the coil, the coil goes on; if there's no power, the coil is off.

    Learning how to PMC...I have some example code from another job. I've attached FANUC Ladder III to Roboguide, so I can experiment and watch signals and understand the code. New to PMC...newish to ladder logic (experienced enough to troubleshoot hold-ups).


    So I'm not understanding how IMSTP and even SFSPD are being kept ON - there's no power going to those signals? I'm trying to digest the information in the PMC Assignment manual too, but that might be doing more harm than good, thanks to the translation.


    I think the answer is that the IMSTP G000.0 address is always being written ON by the PMC? Therefore, unless the UOP is assigned to external I/O, it'll always be ON? Is this correct? Or is this some kind of Roboguide bug (from having FANUC Ladder attached)?


    Thanks ahead of time to anyone who stops by and provides help!


    Taking a shot in the dark here, but if I were the robot programmer for this, I'd ask the PLC programmer to use Explicit Messaging to write the value they want to move the axis to into a Register... They could also use Explicit Messaging to read a Register that the MCH_ANG can be written to, if the PLC programmer needs it in their code to do what they need to do.


    I'd then write a program that takes that Register value from the PLC, puts it into a PR (using PR[i, j]), and then executes the motion.
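
    Something along these lines, just as a rough sketch - the register and PR numbers are placeholders, and I'm assuming the value is a single joint angle going into a joint-representation PR:

        !  R[50] = target angle written ;
        !  by the PLC (placeholder) ;
        PR[10]=JPOS ;
        !  overwrite just the axis you ;
        !  need, e.g. element 6 = J6 ;
        PR[10,6]=R[50] ;
        J PR[10] 20% FINE ;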

    I have a cell I'm working with right now that is based around WeldPRO... I cannot import a robot that has LR Handling Tool into it, but if I could change the backup to HandlingTool, then I could import it. Maybe it's just an oversight or a restriction in the software, I don't know. It would be nice to be able to mix and mingle whatever we need.


    EDIT: Oh, and not a real major want/need, more for the "cool" factor - VR support like ABB's software has.

    Well, how are you performing the 6-point TCP for the laser? It's very hard to accurately create a non-physical TCP using this process. There are a lot of non-obvious errors that can creep in.


    When I've had to do this in the past, I usually did my physical TCP as precisely as possible, then copied that TCP to my non-physical TCP as a starting point. Then I would program the physical TCP to a fine point on my tooling, and begin adjusting the non-physical TCP's XYZ values (not the RPY, unless really necessary) until the non-physical TCP lands on the same mark, moving to the same point with the non-physical TCP active.


    It also helps to use anti-backlash techniques while doing this -- always back the TCP off a healthy distance (10mm or so) and come back at the target from the same direction every time. If you "hunt" back and forth over the target, you'll be injecting the entirety of the robot's backlash issues into the TCP accuracy.

    I'm going to have to read over your comment a few times to understand your method of setting up the TCP as described, but I did want to add to your point about the backlash - that is something I've definitely noticed and have accounted for. I've found there's at least a 2mm change in the laser distance when changing directions.


    *I read over your comment again before posting everything above and below, and I think I understand what you're saying now... Program my Torch TCP (at a wire stickout of 15mm, for example), then copy that value over to the Laser TCP's tool frame, program the physical TCP to a known point, then adjust the Laser TCP to land at the same spot the Torch TCP lands at. This helps line them up... That makes a lot of sense (if I understood it correctly).


    It's a fine method and I'll give it a shot today or early next week... The customers for this job are first-time robot users, so I'm trying to keep everything as simple and to the point as possible for their first dive... If that method works better than the one I used below, then I'll include it in the instruction manual.


    Note: This is my first digestion of your comment in the early morning here and I've only had a few sips of coffee so far! I'll read over this later today or this weekend to get a better understanding I'm sure.


    The 6-point method has to be done a particular way to be accurate in W, P, R. Most people I see do it wrong and end up with the same result as a 3-point TCP, or with incorrect W, P, R angles. A laser sensor would not be easy, but SkyeFire's method sounds like a great way to do it.

    So I've found out! Ha. I ended up using the position popup to set W, P, R to perfect 45° angles for Approach 2 and Approach 3. I think that definitely helped dial in the TCP. Before, I would just throw the robot somewhere left-front of Approach 1 and right-front of Approach 1. Apparently I was not creating an accurate TCP. I was taught a while back that the crazier the data you give the robot, the more accurate the TCP will be - I have learned that is NOT true!!


    Anyways I'll revisit if need be here, but I want to thank SkyeFire and HawkME very much for your time and responses. I very much appreciate you two taking your time to help all of us here on the FANUC Robot Forum! Stay SAFE and stay HEALTHY!

    Hm. Your X search direction, I'm assuming that's in UFrame, not UTool, correct?


    The easy, obvious answer is that there's something off in the relationship between the TCPs. If you activate the laser TCP and rotate around UFrame Z, does the dot stay in place, or does it drift around circularly? If you rotate the tool 90deg around Z, then try the search again with that orientation, does the error remain consistent, or does it change?

    SkyeFire,


    It seems like Robot 1 came into place with the search spun around on Z 90°. Robot 2 is better but still off by a millimeter and a half to two. For some reason my brain isn't allowing me to comprehend what this means for my situation. That I do have X error between my two TCPs, but less error in Y?


    Are you sure it's not that 7 degree Y rotation? Have you tried with that set to 0?

    Yes, I am sure. On Welder A we are running the same parts that have the same burr, so the 7° rotation on Y is needed on that cell as well. Welder A has been working flawlessly. I haven't been taking enough notes, but I believe I have already tried a perpendicular search for Search 1 and Search 2 due to the geometry of one of the model's weld seams. I can't accurately say whether or not the error repeated. I will do a search now with the tool perpendicular and see what I get.


    I did wonder what would happen if I put Welder A's TCP values into Welder B, but after comparing the two, their values are fairly close to one another. I guess that's why I'm slightly biased against it being a TCP issue.


    EDIT: Update for HawkME: I did a -X and -Z search with the laser perpendicular to the part and my error in X is present on Robot 1 again as well as Robot 2. The amount of error on Robot 2 is ridiculous. The laser shines in the corner where it should be and the laser is all the way at the TOP of the ANGLE (this has been consistent).


    EDIT2: I've set up the cell and attached the torch/laser CAD model to the robot. Going to get perfect-world TCPs and put them in the robot. Not sure what it'll prove, but I'm interested to see if perfect TCPs based on CAD models will change anything.


    EDIT3: After placing the Roboguide CAD model values into R1's and R2's Torch and Laser TCPs, I can happily say Robot 2 seems to have fallen in line. Robot 1 is still having an issue in X, though. It looks like this issue is related to TCP after all. I figured it was, but after reteaching and checking the TCPs multiple times and seeing them rotate around well, I didn't think it was TCP-related. I will figure out why the Six Point XZ/XY methods I was using weren't giving an accurate TCP, and write a well-documented work instruction for the customer.

    Is the laser sensor perpendicular to the edge drop off?


    Can you apply a correction factor, and is it consistently good?

    The edge that drops off sometimes has a burr, so when I'm doing my -X check I have the laser sitting back -7° in P (Y rotation). When I do my -Z check, the laser is perpendicular to the flat surface it is checking (because that's when Y, W, P and R are recorded to the Search PR).


    I have applied a correction factor and it seems to be consistently good. Not that you're implying it, but I could leave that in there and run with it; as we both know, though, that's not the right way of doing things.
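
    (If it helps anyone reading later: the "correction" is just a fixed value added to the search result PR before the weld move, something like the line below - the PR element and the 1.5mm value are placeholders.)

        !  add fixed X correction to ;
        !  found position (placeholder) ;
        PR[8,1]=PR[8,1]+1.5 ;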

    Hm. Your X search direction, I'm assuming that's in UFrame, not UTool, correct?


    The easy, obvious answer is that there's something off in the relationship between the TCPs. If you activate the laser TCP and rotate around UFrame Z, does the dot stay in place, or does it drift around circularly? If you rotate the tool 90deg around Z, then try the search again with that orientation, does the error remain consistent, or does it change?

    Yeah so, with how simple this welding system is, I didn't bother using any User Frames. The robots ship on the platform that all the welding is being done on, so no need to bring anything back into place after installation. Everything being done is in UFrame Zero, or World. I have UTool 1 = Torch and UTool 2 = Laser.


    I thought the TCPs being off was the easy answer too, but I have retaught and rechecked the TCPs - both individually between laser touch senses, and then teaching both and trying another laser touch sense - and the error is always there. But once again, only on Welder B. Welder A is working flawlessly on both robots.


    I have checked rotation around my TCPs multiple times and they are as accurate as I can make them. Rotating around Z does give some circular error, but it's about the same as on Welder A. The last time I taught the lasers I used a 2-4-6 block, leveled it with the robot's world frame, and then shone the laser down the X side, then the Y side, so that the laser shines down the side of the block as closely as possible. This way I got rid of any X and Y leaning when doing my Six Point XZ TCP. At least I believe that's what I did.


    You did bring up a good point about rotating my tool 90° around Z and trying the search that way to see if the error repeats. I do like that idea and will give it a shot when I can.

    Hey everyone,


    I have a Keyence IL-300 laser hooked up sidecar to a welding torch. We're looking for Robot Input 1 (or 9 in Robot 2's case) to come ON when it is in the GO range. TCPs are set for the torch and the laser. The issue I'm seeing is that the laser is finding the seam fine, but the Torch TCP is not going to that spot. It's consistently off by a couple of mm on both of my robots... On my duplicate cell, the lasers and torches work flawlessly. I cannot find any major difference between these two and have run out of ideas of what to look for. I've rechecked/retaught TCPs and mastering, and checked for reflection (turned the lights in the cell off).


    The weld seam looks something like a V, but one side of the V is straight, so more like this... I/ if you're looking at Robot 1, or \I if you're looking at Robot 2's. I'm doing a -X search to look for the straight vertical drop, then going to a flat surface behind the angle and doing a -Z search. This way I can get positional data of where the seam is and send the torch TCP there. The customer wants to rely on the laser to find the weld seam, so switching to a V-groove pattern and doing a 1-D search isn't going to fly.


    So when I take the Laser TCP to the saved positional data (let's say PR[8]), the laser shines right at the inside edge of the vertical drop and at the bottom of the angle, where it should be. But when I switch to the Torch TCP and take it to the same PR[8], the wire is usually off in -X from where the laser is.
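
    (In case it helps to picture it, the search-and-go logic boils down to the standard skip-condition pattern sketched below - simplified, with placeholder speeds and only the -X search shown; RI[1], PR[8], UTool 1 = Torch, and UTool 2 = Laser are as described above.)

        !  -X search, laser tool active ;
        UTOOL_NUM=2 ;
        SKIP CONDITION RI[1]=ON ;
        J P[1] 50% FINE ;
        L P[2] 10mm/sec FINE Skip,LBL[99] ;
        !  skip fired: record the spot ;
        PR[8]=LPOS ;
        !  (same idea for -Z search), ;
        !  then send the torch there ;
        UTOOL_NUM=1 ;
        L PR[8] 200mm/sec FINE ;
        JMP LBL[100] ;
        LBL[99] ;
        !  search failed - handle fault ;
        LBL[100] ;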


    I cannot wrap my head around WHY or WHAT is causing this. Especially since Welder A is working fine and I've taught the lasers and torches the same way (using Six Point (XY)).


    Any ideas out there?


    Edit: Forgot to include that I also thought it was a scanning time/speed issue, so I turned the search speed down to 0.5mm/sec and was still not able to bring the Torch TCP to the Laser TCP / laser-found seam... but the difference between Torch and Laser always seems to be consistent. I can't think of what that is actually telling me.

    Not sure if it'll help, but I'll post the two ways I've gone in the past. Both of these also use the "breadcrumb" trail method (i.e., set Register 1 = a number after every move, so you know where the robot last left off):


    For simple material handling, use a PR for your pick/place position and then use other PRs for offsetting that position in your job. That way your end customer only needs to touch up one point in the job. With this method, your Home program (I usually use a PNS job for homing sequences) can then use the same offsets from the pick/place position, and the robot will home using the same programmed path. And if the pick/place PR is ever touched up, then your path (and also your homing path) is automatically updated.
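
    A bare-bones sketch of that first method (PR and register numbers, speeds, and offset values are all placeholders):

        !  PR[1] = pick pos (only taught pt) ;
        !  PR[2]/PR[3] = approach/retreat ;
        R[1]=10 ;
        !  breadcrumb: heading to approach ;
        L PR[1] 800mm/sec FINE Offset,PR[2] ;
        R[1]=20 ;
        !  breadcrumb: at pick position ;
        L PR[1] 250mm/sec FINE ;
        R[1]=30 ;
        !  breadcrumb: retreating ;
        L PR[1] 800mm/sec FINE Offset,PR[3] ;

    Since every move references PR[1], touching up PR[1] updates the pick path and the homing path at the same time, and the breadcrumb register tells your Home job which segment the robot stopped in.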


    The more complicated route is one I've had to go with a complicated weld fixture. The weld fixture had different-sized parts, but the paths were kept the same between all of the parts: program your welding/paths with P[x] (points). At the beginning of your weld job, record each P[x] to a PR; that way, if any changes are made to the weld path, the PRs pick them up automatically. When/if the robot stops during welding, you can check your "breadcrumb" trail to see where you last left off, and then run your Home job. Your Home job has the positional data of the model you're running and is able to home the robot. The only downside of this method is that it is a PR HOG! It was my first attempt at a homing sequence, as well as my first complicated weld fixture.

    As with everyone else, my best experience has been with .STL. The only downsides are lower quality (so no edges, corners, etc. to pick up on) and no colors. Colors aren't a BIG deal, but if you have complex tooling it's nice to have those colors in there to differentiate between clamps, bodies, brackets, etc.

    You can use the "polygon reduction tool" located in the Roboguide folder; this tool can reduce the size of the file significantly.

    Huh, I talked to a guy from one of my company's sister companies and he mentioned this tool, but he had no idea where it came from... Now I have a lead on where I may be able to find it! Thanks, Lootr, I'll be lookin' for it. I would like to use a more detailed CAD format if possible (again).