Posts by TylerRobertson

    If your part geometry changes little, this might be possible without additional software. It's not for the faint of heart - I used to work at Robotmaster, and I have an employee using a post-processor I created to go from MasterCAM to RobotStudio for simulation, and then to milling. Even with my expertise, he's struggling a bit to make it work on a spare 2400 with an IRC5 controller.

    I have a hard time recommending that a single person or small shop attempt to get a system like this working for milling anything beyond prototypes, concepts, etc.

    What model of controller are you running?

    It's been a long while since I've done this, but somewhere on the controller there is a COORD setup that dictates how attached UFRAMEs are rotated.
    The bad news is it's a pain to get right - the good news is you can definitely do it.

    Basically, the controller tries to calculate where the center of rotation of your table is, based on how the COORD frame is set up. This makes a frame that references the rotation of the table, which then governs how your UFRAME rotates when the table turns. This is usually done very 'roughly' because someone set it up just to coordinate a tool tip while jogging, and not for offline programs.

    You might have to do it from a controlled start (x-start, or whatever it's called) at system boot.
    I won't do a good job explaining all the steps; what you'll want to do is get your hands on the Coordinated Motion option manual, as it explains it better.

    Also, you've probably already done this but make absolutely sure your table is actually physically rotating 180 degrees. I've been caught before!
    Setting up an accurate COORD frame will help with this.

    Hours of programming something that may or may not work vs. four wires max on a $100 sensor, plus some basic PLC programming ... unless there are other barriers there (PLC licensing etc.), I would go with the sensor over motor-current checking, just in case.

    I am developing a simulation for planning collision-free paths for a multi-robot pick-and-place task and have only digital models of the robots (1 KR16 and 1 KR6) and the plant CAD data at my disposal. The approximation of some points is a requirement because of tight cycle times...

    What are you using to develop the simulation? This could be a bit easier to estimate with the right software.

    When it comes to relaying information from a robot programmer or automation engineer dealing with robots, get your model and mark the axes of each direction for whatever prints, photos, and displays. After a robot is set up, I like to take the model from simulation or CAD, mark the World/Base, TCP and Work Object axes, and then have a large-format printout mounted near or on the robot cell - no confusion.

    As for the TCP, I believe it generally follows the world axes, and I believe there is a mathematical reason ABB points it down instead of up (it might solve a condition with quaternions), but it's been a while since I had to calculate that kind of stuff.

    Having gone through the same exercises as you when coordinating robots, CNC machines and people ... my advice is to stick to the robot's general axis directions when possible.
    I have rotated work objects for robots so they matched what people were comfortable with on CNC machines, but then the next person jogging the robot is activating/deactivating work objects and world jogging, and inevitably will forget once and crash something.

    It's much nicer if TCP angles and work object angles are small values off from their standard directions.

    This is the dumb-guy approach to this - if your X/Y match when you're milling on one side vs. the other is acceptable, and it's your depth of cut that seems off (leaving uncut stock where the two sides should meet up) - which it sounds like, because you changed your tool length in CAM to solve it - you could instead adjust your TCP to cut the depth accurately. Make a number of test cuts and iteratively adjust the TCP by hand until you get what you desire.
    This only works if you're always cutting the left/right sides of statues, because more than likely, if you do full simultaneous 5-axis (6-axis) milling, you'll see the TCP off in different directions.
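    The knob you're turning in that case is the Z of the tool frame in your tooldata. A minimal sketch - tSpindle and every number here are made-up placeholders, not from any real setup:

```rapid
! placeholder spindle tool: [robhold, tframe, tload]
PERS tooldata tSpindle := [TRUE, [[0, 0, 251.3], [1, 0, 0, 0]],
                           [5, [0, 0, 50], [1, 0, 0, 0], 0, 0, 0]];

! depth was off by 0.5 mm on a test cut -> nudge the TCP Z and cut again
tSpindle.tframe.trans.z := tSpindle.tframe.trans.z - 0.5;
```

    Edit, test cut, measure, repeat until the depth lands.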

    If jogging produces the same problem, then the programmed path isn't as suspect (although the points appear pretty close together for the speed). I don't see anything in this that jumps out at me. I have seen cases where, when a tool change happened and a tool was activated, either the LOAD_DATA wasn't changed or there wasn't anything in the LOAD_DATA for that tool.

    Make sure this:
    ; LOAD_TOOL(1)

    is activating the right LOAD_DATA. You can also force it ... I don't have the syntax in front of me, so you'll have to look it up, but it was something like


    or if using Tool 2


    and each tool will have to have the specified load data in it.

    When jogging after running a program, it's possible to still have LOAD_DATA[2] activated while jogging with TOOL1, if it was changed manually.

    Could you post a sample of your milling file, with the header and approx. 20 lines of code? Depending on motion parameter/zone settings, point spacing, speed, direction etc., this can be solved without diving too far into anything else.

    If nothing exists on it - then what I would do is add features to your workcells - they can be as simple as a tooling ball, or even just three ground/lathe-cut locating features that you can bring the torch wire to.
    Are you using modular welding fixtures (Bluco etc.)? They make stuff like this a lot easier... otherwise you'll have to mag-drill some holes. The absolute locations of these don't need to be super accurate, but do make them as far apart as possible, encompassing your entire work area, fixtures, etc.
    In any case, do this for each robot.

    Now, teach 3 points on each position. Make sure you clip your wire accurately to your TCP on every robot before you teach the points.
    This is not super-fast but with some help you should be able to do this all in maybe 2 days?
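    If you also save those taught points as data, a system module keeps them around independent of any one program - a sketch with placeholder names and made-up coordinates:

```rapid
MODULE CellRefs(SYSMODULE)
    ! three widely-spaced reference points for this cell,
    ! taught with a freshly clipped wire
    PERS pos pRef1 := [1250.3, -420.1, 85.0];
    PERS pos pRef2 := [1248.7, 610.5, 84.8];
    PERS pos pRef3 := [-390.2, 95.4, 86.1];
ENDMODULE
```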

    Then after you move the cells you have the 3 points on each one to reference. Physically move the robots, shim etc. until those points line up again.

    Alternatively you could also teach a work object when you're saving the points - then after you move the robots, teach a new workobject and you can then find the difference and apply that difference to your program work object.
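    The "find the difference and apply it" step can be done with the standard RAPID pose functions - a sketch, assuming wobjBefore/wobjAfter are the reference work objects taught before and after the move, and wobjProg is the one your programs actually use:

```rapid
VAR pose shift;

! rigid shift of the cell, expressed in world coordinates
shift := PoseMult(wobjAfter.uframe, PoseInv(wobjBefore.uframe));

! apply the same shift to the program work object
wobjProg.uframe := PoseMult(shift, wobjProg.uframe);
```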

    Does this make sense? It's not a small amount of work, but it might be faster in the end than re-teaching programs - and then in the future you have those reference points for checking your robot.

    I think I understand what you mean - touching up the points IS essentially creating a new work object, but if you can physically locate on the original points when you move, great.

    Before trying this, you will need to know that your TCP is the same as the one used when the work object was defined - if you can't confirm that, then trying to locate on something will be moot. Usually with a welding torch it's set after the wire is clipped, so it should still be relatively accurate if your programs are running correctly and your torch isn't mangled.

    If you're lucky, someone saved the points they used for the work objects as "pos" variables in the module file or in a system module, so that would be my first place to check; otherwise I'm not sure if/where ABB saves the points used to create a wobj (maybe someone else can chime in on that).

    Alternatively, if you open the module that contains the work object in a text editor, you can get the coordinates of the work object and create a point with the same coords. Then you can make a MoveJ to that position (be careful) and the robot will move to the origin of the work object. Again, make sure you're using the same TCP, or you could have weird orientations or crash your robot. If you successfully get an origin that looks to make sense (corner of fixture, target on fixture, etc.), then you can use the Jog screen and workobject jogging to move along a line and check that you're following the fixture - from there it should be pretty clear whether the work object was following something in the cell.
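    That "MoveJ to the origin" check is just a zero target in the work object - a sketch (tWeld and wobj1 are placeholders; the [1,0,0,0] orientation may swing the torch around, hence the slow speed and the "be careful"):

```rapid
! origin of the work object: zero position, neutral orientation
CONST robtarget pWobjOrigin := [[0, 0, 0], [1, 0, 0, 0],
                                [0, 0, 0, 0], [9E9, 9E9, 9E9, 9E9, 9E9, 9E9]];

MoveJ pWobjOrigin, v50, fine, tWeld \WObj:=wobj1;
```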

    For this reason I would recommend to anyone designing a workcell:
    - common base for robot and fixtures/parts
    - locating features for fixtures
    - engraved/scribed/machined features for visual locating
    - treat robot applications like machining applications and you will thank yourself later!

    If you move the robots and they don't share a common base with their fixtures (not cement, but a flat steel base), you will probably have to make a new work object. You could avoid this if you have some kind of indication procedure (tooling balls/indicators) to ensure you physically reinstall them in the correct location.
    Otherwise, if you already have a work object, only make a new one at your new facility if your points are off.

    I would also support the robot arms with some kind of dense material (foam/wood) while moving so no large jarring motion will cause issues. Air ride truck recommended.

    Anytime I've taken the time to have robots packed up with a lot of support, I haven't had calibration issues on power-up (assuming the batteries are okay).

    Make sure you have OS disks for each controller on hand, just in case, to handle the worst.

    Make sure your team that is disconnecting the robots/cabinets knows what they are doing and doesn't damage cables / wires - even if they know what they're doing, a reminder of precautions doesn't hurt. You don't want to be scrambling to try and get an order for old cables for those controllers.

    Also keep in mind where cabinets will go and length of cable runs - we mirrored a room and when the architect planned the drops we didn't have enough cable length for the cabinets.

    Iowan you are absolutely right, it is me that is misremembering - I got thinking there was a different work object definition on the S4 controllers that used 4 points, and I swear I had a memory of doing this ... maybe Andy can enlighten us.

    Iowan is totally right - the X1 point is the origin, and it's a user frame similar to Fanuc's.

    I also suggest teaching a TCP regardless - if you ever have to take your tool off, you can then update your TCP. Also, your speed/feedrate is calculated at your TCP, so if you are moving around a corner, even if it is flat X/Y, your robot wrist might be making a large motion, and therefore your speed might not be consistent at the point you are cutting at.

    Real easy to do in auto mode ... not so much in manual. I think with ABB it was designed so that if you're using it in manual mode, 'Debug > PP to Routine > R_Home' should be simple enough if you know how to do everything else.
    It's a design choice anyway.

    The 'easy' or 'safe' way I chose to do this is to have operators run the machine from an interface built as an application, with touch buttons on the screen: 'Production', 'Home', and so on.

    Multi-tasking could do it, but what you could also do is check an input with a 'Trap' routine (read about those in the manual - there's a lot there).

    Trap routines are pretty straightforward - personally I don't like putting motion in them, because the motion executes from wherever the trap happens, so you have to make damn sure the machine can get safely to the position if an error occurs. That said, it's a great way to safely home a robot if you have a limited program set.

    The trap routine is triggered by an input - when the input happens the robot program pointer moves to the trap routine, and then moves back to regular program execution. I use it exclusively to stop motion, and display errors.

    Your Trap routine could then look like

    TRAP xxxx
        TPWrite "Press START to GO TO HOME";
        TPReadFK answer,"","START","","","","";
        MoveJ pHome, v500, fine, tool0;    ! pHome = a taught home; StopMove/StorePath may be needed first
        Stop;
    ENDTRAP

    So when you press the button to home, the robot will stop what it's doing, prompt you to press 'start' to home the robot, move to home, and then stop program execution.

    If you pressed start after that it will continue in whatever program it left off from.

    You would have to put this trap routine either in a program running in a separate task to have it occur anytime (dangerous), or put it only in the programs that you would want to be able to home from. It should also check that the robot is not in auto mode.
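    Putting the pieces together, the hookup is a CONNECT plus ISignalDI in whichever program should allow homing. A sketch with placeholder names (diHomeBtn, trGoHome, pHome); the StopMove/StorePath/RestoPath pattern is the usual way to run motion from a trap:

```rapid
MODULE MainModule
    VAR intnum intHome;
    VAR num answer;

    PROC main()
        CONNECT intHome WITH trGoHome;
        ISignalDI diHomeBtn, 1, intHome;   ! fire the trap when the input goes high
        ! ... normal production motion here ...
        IDelete intHome;                   ! disconnect before the program ends
    ENDPROC

    TRAP trGoHome
        TPWrite "Press START to GO TO HOME";
        TPReadFK answer,"","START","","","","";
        StopMove;       ! halt any motion in progress
        StorePath;      ! park the interrupted path
        MoveJ pHome, v500, fine, tool0;
        Stop;           ! wait here; pressing START resumes below
        RestoPath;      ! restore the interrupted path
        StartMove;
    ENDTRAP
ENDMODULE
```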

    hey Andrew,

    Just in addition to the manual, a few things that are a bit different compared to FANUC:

    - the work object has two frames: the object frame and the user frame. Because I first learned on FANUC, I just use the 'user frame' whenever I can and keep the object frame undefined. The object frame is a frame shifted from the user frame... a frame on a frame, if you will; the object frame helps if you have a frame on a table and then another frame on a fixture, or multiple fixtures.

    - the 4 points are X1, X2, Y1 and Y2 ... you're basically setting four points on two lines. It feels a little odd when you're used to setting an Origin, then X1, Y1.
    - the work object is only active if it's called out in the move. You can name the work objects whatever you want. wobj0 is the robot base, or generally set to the same as your 'world' (not always).

    MoveAbsJ start, v2000, z40, grip3; <- no work object

    MoveL start, v2000, z40, grip3 \WObj:=wobj1; <- work object

    - You can have multiple work objects in a program, and the definitions are usually all at the top of the program if you look at it in a text file (ABB also lets you easily save and load .PRG files to and from the controller and view them as text). You can download RobotStudio and use its text editor for free (it will highlight and check syntax).

    - work objects that you want stored across programs (ie: the same frame in all programs, and if you update it, it will be referenced by all programs) should be stored in a system module.
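    For reference, this is what a work object declaration looks like at the top of a module - the values are made up; the two pose blocks at the end are the user frame and the object frame:

```rapid
! [robhold, ufprog, ufmec, uframe [[x,y,z],[q1..q4]], oframe [[x,y,z],[q1..q4]]]
PERS wobjdata wobjFix := [FALSE, TRUE, "", [[800, 200, 100], [1, 0, 0, 0]],
                         [[0, 0, 0], [1, 0, 0, 0]]];
```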