Robotforum - Support and discussion community for industrial robots and cobots › Members › TygerDawg

Posts by TygerDawg

  • Offline programming without calibrating

    • TygerDawg
    • April 26, 2025 at 4:10 AM

    It's not clear (to me) what you are asking / seeking. When you mention "calibration", do you mean use of those complicated expensive tracker devices? I've never used them. I've never known anyone who could afford them.

    I've done many high-precision / high-tolerance robot apps with OLP. Almost any OLP software will support this method. The workflow goes like this:

    1. Plan your programming with extensive employment of User Frames or coordinate systems in order to calculate path locations relative to the Frames. These Frames need to be mathematically attached to fixtures, workpieces, whatever makes sense.
      • The goal is to teach (or calculate) your path points relative to a Frame. Your path locations will be calculated (in programming) using relative coordinate transformations.
      • Then go out to the work cell and teach those Frame locations to "calibrate" the physical work cell to the CAD or OLP work cell. For most applications, "close" is close enough.
      • Frames can be aligned with corners of tables & fixtures, or on any three "touchable & repeatable" arbitrary points on parts, components, anything. Frames needn't be orthogonally aligned with anything in the physical work cell; even arbitrarily skewed Frames based upon physical points work just as well.
    2. Design your CAD or OLP work cell "close to" the physical cell.
      • Make approximate measurements to the User Frame(s) from the robot's origin planes. I used a tape measure for this, hoping for ±100 mm location accuracy.
      • Construct the CAD or OLP cell using this location data
    3. Build your OLP simulation model and make all path points relative to the appropriate User Frame(s)
    4. Prepare your physical work cell / program.
      • include one or more Frame location programming calculations using the three common Frame location variables (typically: Origin, X-Axis, Y-Axis direction)
      • teach those Frame calculation locations as location variables. I used a fabricated pointer tool with appropriate pointer tool transform invoked for the purpose
      • program path points will all be relative to the User Frame(s), so this relative operator must be part of the motion statement
      • this requires (back to Step 3) that your Frames and Frame locations all be "touchable & repeatable" with the pointer tool
    5. test, adjust, and run your program using those User Frames
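    The three-point Frame math behind Steps 1 and 4 can be sketched in plain Python (an illustration, not any vendor's API; the point values are invented):

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def unit(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def frame_from_points(origin, on_x, on_xy):
    """Build an orthonormal Frame from three taught points: the origin,
    a point on the +X axis, and a point in the +XY half-plane."""
    x = unit(sub(on_x, origin))
    z = unit(cross(x, sub(on_xy, origin)))
    y = cross(z, x)                      # completes the right-handed set
    return origin, (x, y, z)

def world_to_frame(p, frame):
    """Express a world-coordinate point relative to the Frame."""
    origin, axes = frame
    d = sub(p, origin)
    return tuple(dot(d, axis) for axis in axes)

# A skewed Frame taught on a tilted fixture works just as well:
f = frame_from_points((100.0, 50.0, 0.0), (110.0, 60.0, 0.0), (95.0, 55.0, 0.0))
print(world_to_frame((100.0, 50.0, 25.0), f))   # ~ (0, 0, 25): 25 mm up the Frame Z
```

    Re-teaching only the three points (as in Step 4) updates the whole path, which is what makes the method adaptable to work cell changes.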

    This method easily adapts to dynamic work cell changes. All that is required is to re-teach three Frame locations as needed. With smart design, a pointer tool can be easily integrated into existing tools. I have even designed and constructed specialized "Frame teaching appliances" that allowed me to quickly, easily, and accurately re-teach a Frame. I no longer have the eyes for such high tolerance work, but in my day I was able to hold path accuracy to ±0.010 mm on my OLP CAD→Path applications.

  • userPage manual

    • TygerDawg
    • April 12, 2025 at 4:17 PM

    Response may be too late for your needs, but I'll post anyway.

    Here are three sample menu programs from many years ago:

    • a generic template program to build a teach pendant menu
    • a sample program for teach pendant menu
    • a (modified / redacted) program data file showing the string variables & formatting used for the menus

    This was developed long ago for the older teach pendants (unknown if suitable for modern units).

    displayMessage.zip

  • Possible ways of using a real-time image of a flat object in the process of painting that object with a robot.

    • TygerDawg
    • February 19, 2025 at 3:01 AM

    Another approach, one I've used many times.

    It requires use of a simulator or other program that can do CAD-to-Path functionality (e.g., RoboDK or similar).

    As a guide, this is what I've done in SolidWorks and various simulators:

    1. create a 3D CAD model of your workpiece
    2. insert a JPEG of your photo as a Sketch on the desired surface
    3. scale the image to match reality (test to be certain)
    4. build another Sketch on the surface that describes the robot path(s) to be taken...you should be able to see the path(s) overlaid on the image overlaid on the workpiece
    5. assign three "feasible, touchable, and repeatable" physical point locations on your workpiece and CAD model that will be used to construct an arbitrary coordinate system ("Frame") for "calibration" of your CAD model to an acceptable approximate location of your path in real space (Origin point, X-Axis point, Y-Axis direction point). Construct any type of "appliance" or "location aid" needed to help you get a good point and add that to the CAD model
    6. create a simulated workcell with your workpiece
    7. use the simulator's path development functionality to develop a program to move the EOAT along the path(s) relative to the part frame
    8. teach the as-installed three points in the physical workcell to establish the Frame (you'll need an accurate pointer & tool transform for your robot). This will be used in your functional robot program.
    9. make your program run the series of path points relative to the Frame
    10. tweak & adjust
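    Steps 4–9 amount to re-expressing the sketch path in world coordinates through the taught Frame; a minimal Python sketch (point values invented, not RoboDK's API):

```python
# Map 2D sketch-path points (u, v) on the workpiece surface into world
# coordinates through the taught part Frame: p = origin + u*X + v*Y.
def frame_to_world(uv_points, origin, x_axis, y_axis):
    return [tuple(o + u * xa + v * ya
                  for o, xa, ya in zip(origin, x_axis, y_axis))
            for (u, v) in uv_points]

# Hypothetical taught Frame: origin at (500, 200, 50) mm, unit axis vectors.
path_world = frame_to_world(
    [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)],
    origin=(500.0, 200.0, 50.0),
    x_axis=(1.0, 0.0, 0.0),
    y_axis=(0.0, 1.0, 0.0),
)
print(path_world)  # [(500.0, 200.0, 50.0), (600.0, 200.0, 50.0), (600.0, 250.0, 50.0)]
```

    Re-teaching the three physical points (Step 8) changes only `origin`, `x_axis`, and `y_axis`; the sketch path itself stays untouched.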

    Accuracy required depends upon the requirements of your process. Spray painting can generally have sloppy tolerances. Precision pin striping requires tight tolerances.

  • Best practice: Parametric placement with automatic path generation -> Software ?

    • TygerDawg
    • January 11, 2025 at 3:02 AM

    It appears that you are seeking a solution that completely removes the human brain's cognitive analysis, spatial perception, and problem-solving functions from the robot programming and integration task. I think it will be difficult to find a COTS solution for this. OK, I'll be negative: impossible to find. I've been involved with offline programming technology for a long time. Customers or Managers wanted a miracle software solution so that they would not require highly paid robot programming personnel. The solution was never developed.

    The solution would require a lot of background programming that would accommodate a lot of variables and random events that would inevitably occur. Products arrive dimensionally correct, but vary within a tolerance (and sometimes out of tolerance). Actuators, sensors, and bearings degrade over time. Friction and static vary with humidity. 3rd Shift operators tinker with the bits while no one is looking.

    That level of programming is prohibitively expensive for a general-purpose COTS product. The digital model for simulation is built to 16 or 32 (or perhaps it is now 64?...I'm getting old) decimal place accuracy. To make a simulation-developed program work perfectly would require the physical installation match the digital model to a high degree of positional accuracy in 6DOF. This requirement can be attenuated somewhat if the process tolerance requirements are loose. But all bets are off if an inattentive janitor or material handler puts a trash can or parts box in the wrong place.

    The optimistic Engineer in me says that it is possible and can be done, but I doubt anyone would want to pay the price or schedule required to develop such a specialized solution. And this discussion doesn't include the required product support to keep it compatible with the latest CPU or OS updates.

    Given the current state of the art, IMHO your best course of action for success in the short term is to use a suitable simulator to achieve that 70%-90% reduction in programming labor, then use skilled humans to complete the task. And, later, to fix things when they blow up or change.

  • CS8C TX90

    • TygerDawg
    • September 17, 2024 at 4:46 AM

    Here is a code snippet from a long time ago (2008). This is a motion subroutine of a larger application. It moves workpieces from one stack to another stack.

    The statements instruct the robot to move through a sequence of various locations with approaches, linear moves, retracts, and transitions between locations.

    Much more than this, and you'll need a VAL3 training class.


    <code>begin
    // ==========================================================================
    // PROGRAM RightToLeft()
    //
    // OPERATION:
    //   Starting point jSafe0 in pgm PickAndPut()
    //   pick from RIGHT stack to LEFT stack
    //   assumes RIGHT stack is full, LEFT stack is empty
    //
    // called from: PickAndPut()
    //
    // incoming parms: none
    // outgoing parms: none
    //
    // by XXX 6mar08
    // ===========================================================================
      nFirstIndex=nStackQuantity-1
      nLastIndex=0
      nStep=-1
      //
      for nIndex=nFirstIndex to nLastIndex step nStep
        pMovePoint=pRightStackBot
        pMovePoint.trsf=pRightStackBot.trsf*{0,0,-nPlateThickness*nIndex,0,0,0}
        //
        movej(jOverRightStack,tPickTool,mTraverseSpeed)
        movel(appro(pMovePoint,trApproach20),tPickTool,mApproachSpeed)
        movel(pMovePoint,tPickTool,mPickSpeed)
        waitEndMove()
        io:bOut0=true
        delay(nPickDelayTime)
        movel(appro(pMovePoint,trApproach20),tPickTool,mPickSpeed)
        movel(pOverRightStack,tPickTool,mCarrySpeed)
        movej(jRightShowPlate,tPickTool,mCarrySpeed)
        //movej(jOverRightStack,tPickTool,mTraverseSpeed)
        //movej(jRightShowPlate,tPickTool,mTraverseSpeed)
        //
        movej(jMiddle,tPickTool,mCarrySpeed)
        movej(jLeftShowPlate,tPickTool,mCarrySpeed)
        //
        // traverse across from RIGHT STACK to LEFT STACK and place the plate
        //
        pMovePoint=appro(pLeftStackBot,{0,0,(nPlateThickness*(nStackQuantity-nIndex))+nDelta,0,0,0})
        pMovePoint.trsf=pLeftStackBot.trsf*{0,0,(-nPlateThickness*(nStackQuantity-nIndex))+nDelta,0,0,0}
        movej(jOverLeftStack,tPickTool,mCarrySpeed)
        //movej(jLeftShowPlate,tPickTool,mTraverseSpeed)
        //movej(jOverLeftStack,tPickTool,mTraverseSpeed)
        movel(appro(pMovePoint,trApproach20),tPickTool,mApproachSpeed)
        movel(pMovePoint,tPickTool,mPutSpeed)
        waitEndMove()
        io:bOut0=false
        delay(nPutDelayTime)
        io:valve1=true
        movel(appro(pMovePoint,trApproach20),tPickTool,mApproachSpeed)
        waitEndMove()
        io:valve1=false
        movel(pOverLeftStack,tPickTool,mTraverseSpeed)
        //movej(jOverLeftStack,tPickTool,mTraverseSpeed)
        //
      endFor
    end

    </code>

  • vacuum gripper

    • TygerDawg
    • June 26, 2024 at 5:42 PM

    I have taught this to robotics students for years. Sizing vacuum components is a complex task because compressed air and vacuum are not linear functions. This is not to be taken lightly if you want to deliver a safe and robust solution.

    Every vacuum component supplier that I know will provide an Engineering section of their catalogs for users to make correct calculations. These calculations will provide such important information as maximum lifting force, minimum quantity and size of cups, flow rates, and minimum pipe sizing. Unfortunately, they all seem to calculate the values a little bit differently, which can be confusing. Schmalz, Piab, Camozzi, others all have excellent Engineering resources.

    Then after the calculations are made, the task is to select the correct style of cup, cup material, and method of vacuum generation. Finally the end effector structure and piping is designed. The supplier's Technical Support Engineers are there to help navigate the complexity.
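    The core of those catalog calculations is small; a simplified, textbook-style sketch in Python (not any supplier's official formula — all numbers are invented, and a real design must follow the supplier's engineering data):

```python
import math

def required_holding_force(mass_kg, accel_ms2=5.0, safety_factor=2.0):
    """Simplified worst-case force for a vertical lift: F = m*(g + a)*S.
    Suppliers publish their own (stricter) versions of this formula."""
    g = 9.81
    return mass_kg * (g + accel_ms2) * safety_factor

def cups_needed(mass_kg, cup_diameter_mm, vacuum_kpa=60.0):
    """Theoretical force of one cup is F = dP * A (assumes no leakage
    and a flat, non-porous part -- rarely true in practice)."""
    area_m2 = math.pi * (cup_diameter_mm / 1000.0 / 2.0) ** 2
    force_per_cup = vacuum_kpa * 1000.0 * area_m2     # Pa * m^2 = N
    return math.ceil(required_holding_force(mass_kg) / force_per_cup)

# A 5 kg part with 40 mm cups at 60 kPa vacuum:
print(cups_needed(mass_kg=5.0, cup_diameter_mm=40.0))  # 2
```

    Note that horizontal handling with shear loads, porous parts, and oily surfaces all demand larger safety factors, which is exactly why the supplier's Engineering sections and Technical Support exist.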

    Tread carefully laddie, there be beasties and goblins in these woods.

  • Accurate part pick up

    • TygerDawg
    • January 27, 2024 at 5:01 PM

    Does "machine tending" mean "machine tool load & unload"?

    If so, the cycle time is probably long enough that you can incorporate a place / re-position / re-grip sequence of steps that may mechanically give you the grip accuracy you need.

  • Crafting an Affordable DIY Collision Detection Sensor for Industrial Robot Tools: Exploring Effective and Low-Cost Solutions

    • TygerDawg
    • January 27, 2024 at 4:53 PM

    IMHO, you are going to discover that there is no solution that meets all of these requirements:

    1. inexpensive
    2. easy to implement
    3. does not restrict working volume or access to workpieces
    4. simple and fast to build as a DIY system

    In my teaching episodes, I finally and reluctantly acknowledged that students will be students: clumsy, underskilled, impatient, unwilling to read instructions, convinced there are no consequences to their reckless actions, and certain they already know everything. They also think every machine should have the same ease of use and speed of reaction as an iPhone.

    My teaching method guides the students into the realization that robotic programming is a game of diligence, discipline, and patience. My teaching rigs all have surfaces of home-store semi-rigid foam insulation boards for the inevitable crashes and scrapes. My fundamentals teaching methodology is delivered in progressive steps using a flat table surface and a simple pointer tool:

    1. Rigorous skills development of motion mode and pendant fundamentals with practicum before proceeding to simple programming. Students must pass this 100% before moving on to demonstrate competency.
    2. Simple joint mode teaching of a path. The printed path contains linear & circular paths, but the first exercise is strictly non-oriented point-to-point joint moves. Tool-Z is not aligned perpendicular to the surface and this enhances understanding of coordinate systems and limits of joint mode teaching.
    3. Same path, but this time introduce Tool-Z perpendicular to surface for ease of teaching. Base mode and tool mode teaching used.
    4. Same path, but Tool-X (or Tool-Y) must now change orientation and follow the tangent of the path.
    5. Same path, but now linear & circular interpolation is introduced.
    6. Same path, but now variable speeds and variable tool tilt orientations are introduced.
    7. Same path, but now variable point accuracy is introduced (which is manifested in a zig-zag path segment).

    By the time the students get to Step 3 they are usually very competent and don't crash the robot. Usually. After Step 3, the remaining steps progress very rapidly. But crashes still occur in the foam board, so no damage is done.

    After students complete this sequence, we move on to more advanced 3D paths with integrated gripper & sensor tasks. Home-built fixtures and teaching objects are clamped on the foam board.

    If you are committed to a crash sensor, then here are a few ideas for DIY devices to ponder:

    • investigate "3D printed flexure robot gripper" for images of ideas of things already done
    • flexible strip resistors attached to flexible elements, then measure the voltage through the resistors to sense deflection. I had a student develop this idea for sensing the bending position of a finger inside a glove. This would require electrical integration & programming.
    • 3D printed hacked copies of commercially available breakaway devices...maybe could be done. Maybe.
    • Tool-Z collision devices may be sufficient with spring-loaded designs.
    • Depending on how sophisticated your robot of choice is, there may be an opportunity to monitor the current of the wrist joint motors. Too much current = crash or collision. But this is a tricky software solution.
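    The motor-current idea in the last bullet could be prototyped along these lines (generic Python; the thresholds and readings are invented placeholders, and a real implementation would need per-joint, per-motion tuning inside the controller):

```python
from collections import deque

class CurrentCollisionMonitor:
    """Flags a suspected collision when a joint motor current jumps well
    above its recent moving average. Window and ratio are made-up
    starting points, not validated values."""
    def __init__(self, window=20, ratio=2.5):
        self.history = deque(maxlen=window)
        self.ratio = ratio

    def update(self, current_amps):
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            if baseline > 0 and current_amps > baseline * self.ratio:
                return True        # collision suspected
        self.history.append(current_amps)
        return False

mon = CurrentCollisionMonitor()
readings = [1.0] * 20 + [1.1, 4.0]    # steady motion, then a spike
flags = [mon.update(a) for a in readings]
print(flags[-2:])  # [False, True]
```

    The tricky part, as noted above, is that legitimate accelerations also spike the current, so a fixed ratio will false-trigger without motion-aware tuning.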
  • Robot Cell moving day-How to secure for shipment

    • TygerDawg
    • December 12, 2023 at 6:46 AM

    And for another opinion...

    When I was involved with shipping arms all across North America, we had the following guidelines:

    • even if we specified an "air ride van" for shipping via truck with a softer ride, we always assumed the arm would be subject to severe road vibrations and those vibrations could possibly damage joint transmission gear teeth through repetitive stressing
    • arm would be securely bolted to machine base or heavy shipping pallet
    • arm would be tucked to minimize extended arm link mass that, when subject to road vibrations, would impart inertial loads on the links that would create unwanted torque and possibly damage gear teeth...tucking position was chosen to minimize this effect and we never sent arms in extended vertical configuration
    • we would always use some type of heavy padding in the tucked position...polyurethane foam, layers of heavy bubble-wrap, blue/pink house insulation foam, or even folded shipping blankets to help mitigate vibration-induced relative motion...the inertia would be absorbed through the housing element upon which it rested
    • Once a bunch of Valedictorians shipped our equipment during winter on an open flat bed trailer. The equipment arrived at the destination covered with winter salt spray and rusted over. After that, we sprayed preservative on all unpainted surfaces and shrink-wrapped everything
  • Kawasaki + Cognex vision setup

    • TygerDawg
    • June 28, 2023 at 4:31 AM

    Check with Cognex. They offer an AS Language template file and PDF instructions for Cognex to Kawa communications.

  • Frame function

    • TygerDawg
    • June 21, 2023 at 4:55 PM

    It has been many years since I was deep into V+ language, but something doesn't seem correct with your code. Perhaps. Maybe. Possibly.

    Your method of calculating and using the INVERSE transform dif bothers me for some reason, and I cannot discern why (too many years ago). :sleeping_face:

    Establish a frame (user coord sys) in space with

    SET pallet = FRAME(loc.origin, loc.x.axis, loc.y.axis, loc.origin)

    This assumes that you taught the three frame locations properly (and taught with an accurate tool):

    • loc.origin is at the ORIGIN of the new frame
    • loc.x.axis is ON the X-AXIS of the new frame
    • loc.y.axis is somewhere on the positive-Y side of the X-Y plane of the new frame (not required to be ON the Y-AXIS of the frame, but it is nice if it is)
    • fourth argument defines the location of the new frame origin
    • orientation of the three frame location points is irrelevant, only X, Y, Z are used to calculate the frame

    Do all of that, then your command to calculate a point relative to the new frame pallet is:

    SET pick = pallet:TRANS(x_offset, y_offset, z_offset)

    BUT it is critical to recognize that the new point pick has the orientation of the new frame pallet. That is, if your frame pallet Z-AXIS is pointing UP (e.g., towards the ceiling) and you desire to have your gripper Tool Coord Sys Z-AXIS pointing DOWN to the workpiece (e.g., towards the floor), then you must reverse the orientation.

    SET pick = pallet:TRANS(x_offset, y_offset, z_offset):TRANS( <either RX = 180, or RY = 180> )

    If frame pallet is taught with foresight, the pallet Z-AXIS will already point in the direction you need for the tool Z-AXIS; then you only have to adapt the X, Y, Z offsets to match your physical position.
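    That orientation flip can be verified numerically (plain Python, not V+):

```python
import math

def rot_x(deg):
    """3x3 rotation matrix about the X axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Frame Z-axis points UP; composing with RX = 180 (the extra TRANS above)
# flips it to point DOWN toward the workpiece.
z_up = [0.0, 0.0, 1.0]
z_after = mat_vec(rot_x(180.0), z_up)
print([round(x, 9) for x in z_after])  # z now points straight down: ~[0, 0, -1]
```

    An RY = 180 flip does the same to Z but mirrors X instead of Y; which one you pick decides where the tool's X-axis ends up.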

  • Is this idea even feasible? If yes, where do we begin?

    • TygerDawg
    • June 21, 2023 at 4:29 PM

    Consider doing searches for "how to build an electric go kart." Plenty of people have posted their attempts at building various vehicles with various levels of performance. Reviewing these posts may enlighten you to the world of mechanical engineering, mechatronics, and power transmission.

    Unsure why you need Raspberry Pi for this. But one of our local University Summer Engineering Camps does R-Pi programming with the MIT SCRATCH application for drag & drop R-Pi programming. Very easy for non-programmers.

    After review, you may choose to punt on the idea of a ZooWagon and change the project scope to a little ScooterBot to run around the house instead. I did not look, but I would suspect that there are 100's of free project plans for such kid-friendly projects. What you DON'T want is to be overly ambitious and discourage your kids from exploring this world because they were overwhelmed by complexity and got burned out.

    BTW: "pulling a wagon around...the zoo." That's called exercise, isn't it? Just sayin'. :winking_face:

  • Tool rotation in movel bugged?

    • TygerDawg
    • May 18, 2023 at 6:03 AM

    It is difficult to discern because no code was provided.

    But it looks like this bug may be a logic error in your code, not so much the value of the rz.

    It has been many years since I touched a Stubby arm. But if I was programming this, I would establish a frame at the bottle. Then calculate some quantity of locations relative to that frame to provide the cap turning action. Execute those motions as a subroutine.

    You probably do not have the infinite-turn Joint6 option, so you must verify you start the turn at a good J6 location each time. Else you will accumulate J6 degrees.
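    A generic way to keep J6 from accumulating between cycles (plain Python illustration, not the controller's syntax):

```python
def normalize_joint(angle_deg):
    """Fold an accumulated joint angle back into (-180, 180] so each
    cap-turn cycle starts from an equivalent, reachable J6 position."""
    a = angle_deg % 360.0
    return a - 360.0 if a > 180.0 else a

print(normalize_joint(540.0))   # 180.0
print(normalize_joint(-450.0))  # -90.0
```

    On a real arm you would instead move to a taught start pose in joint mode before each cycle, which resets J6 the same way.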

  • robot arm for 3d printing

    • TygerDawg
    • May 18, 2023 at 5:48 AM

    3000 mm height would be a challenge, given the vertical orientation of almost any extruder tool. Desiring a second hand robot suggests you possibly have a limited budget. Another challenge.

    For really BIG workpieces, I concur with HawkME on the gantry suggestion.

    I have seen Thermwood's Large Scale Additive Manufacturing machine up close. It is a large format gantry dispenser.

    Thermwood LSAM - Large Scale Additive Manufacturing

    It is used to dispense polymer structures and then machine the surfaces smooth. The machine uses Siemens PLC + CNC + servo drives. CNC path programs developed from CAD-CAM software (with additional customization). Servo drives for axis motion and large raw material melting extruder and smaller precision dispense extruder. Many man-years of engineering development work was done on this impressive beast. Very expensive. The last I heard (couple years ago), the US Government ordered so many LSAMs that Thermwood had a multi-year backlog.

  • How to determine the maximum payload for a robot

    • TygerDawg
    • May 18, 2023 at 5:32 AM

    Robot specification sheet will specify mass moment of inertia MMI limits for Joint5 and Joint6, sometimes also Joint4.

    MMI values specify the maximum torque the joint can handle via the equation of motion

    Torque = (MMI) x (Alpha).

    MMI is directly related to your end-effector mechanical design (mass, size, etc.). For example, SolidWorks can use its Mass Properties function to calculate an end-effector assembly model's MMI about specified axes.

    These MMI limit values are developed experimentally by the manufacturer for full payload & full speed motion in all configurations. Limits are due to motor torque and gear train design in order to meet repeatability specifications and survive the design service life. MMI values can be exceeded if speed is reduced. The reduction amount is not specified and usually determined by experimentation and experience.

    If the end of arm tool has any additional loading (such as forces applied to a milling cutter), then these forces will apply torques to the joints 5 & 6. These additional torques must be accounted for in the overall payload analysis.
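    The check described above, sketched in Python with invented numbers (the spec-sheet limit and tool MMI here are placeholders, not any real robot's values):

```python
def joint_torque(mmi_kgm2, alpha_rad_s2):
    """Equation of motion about one wrist joint: Torque = MMI * Alpha."""
    return mmi_kgm2 * alpha_rad_s2

def within_limit(mmi_kgm2, alpha_rad_s2, torque_limit_nm):
    """If the demand exceeds the spec-sheet limit, reduce speed (which
    reduces alpha) or redesign the end effector to lower its MMI."""
    return joint_torque(mmi_kgm2, alpha_rad_s2) <= torque_limit_nm

# Hypothetical: a 0.15 kg*m^2 end effector accelerated at 20 rad/s^2
# against an invented 5 N*m limit for Joint6.
print(joint_torque(0.15, 20.0))       # 3.0 N*m
print(within_limit(0.15, 20.0, 5.0))  # True
```

    Any process forces at the tool (the milling-cutter case above) add their own torque terms on top of MMI * Alpha before comparing against the limit.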

  • choosing the right robot for an high school robotics course

    • TygerDawg
    • November 18, 2022 at 8:25 PM

    First things first: decide what it is you wish to teach your students.

    • fundamental exposure to robot technology
    • simple programming via Teach Pendant Programming TPP
    • more complex programming with structured language
    • simple pick & place or more complex path applications
    • systems integration (sensors, grippers, vision systems...)

    Simple: less time, effort, cost. Students get less learning and skills.

    Complex: more time, effort, cost. Students get more skills (but is more needed for High School?)

    Do your students already have some programming experience? Many do not. Nor have patience to read, study, and learn.

    My University / Engineering Technology labs use small Kawasaki units. Selected because of TPP +AND+ a very powerful & easy structured language together. Not terrible. But I only use TPP because my students are iPhone generation and don't know, and don't care, about structured language programming.

    I use RoboDK extensively in my course. I want students to know what offline programming is and also to help them visualize 6DOF motion. I have an Education license: minimal or no charge, full function, six months. RDK is a very good package once you get over the learning curve. It can integrate Python programs for exotic actions, so you can do "programming" with that.

    I wanted to evaluate an interesting new small lab scale 6-axis cobot from IGUS. But these products are new release this past Spring and I have not been able to get my hands on one yet. I ***think*** I got budgetary pricing of ~USD$9K (?) each at one time. They also have other types like gantry units.

    Skyefire mentioned Automation Studio. Fabulous product, insanely expensive for Academic use. I discovered a much less expensive alternative that I use in my classes, AUTOSIM PREMIUM from IRAI France. Less polished than AutoStudio or the equivalent FESTO software package, but very powerful anyway. I use it to teach students fundamentals of pneumatic circuits. My colleague uses it for hydraulics technology. It can also be used for PLC and industrial control circuit simulation.

    IRAI France also has an interesting "robot" simulator package for HighSchool or less called MIRANDA. You may find this interesting. Find link to it on their website.

  • Robot to run a microphone on a 2D plane? (buy, make, purchase?)

    • TygerDawg
    • October 27, 2022 at 9:26 PM

    Many, many options available for this. Some may even be inexpensive.

    Search strings:

    • Cartesian robot or Cartesian actuator
    • XYZ Gantry or XYZ Gantry robot

    You need to define your maximum Z-axis height.

    Resolution is usually very high, but you need to define your needs.

    Motors used to drive the axes may cause noise problems depending upon your required traverse speeds.

  • Kawasaki -- Right-hand or Left-Hand?

    • TygerDawg
    • June 10, 2022 at 6:24 AM

    Concur.

    Staubli, Kuka, etc., have +World_X axis pointing forward from the robot.

    Kawa is rotated, +World_Y points forward from the robot.

    The Kawa World coord sys is like most milling machine tools: standing at a mill, +Z is UP, +X points to RIGHT, and +Y points FORWARD.
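    Converting a taught point between the two conventions is just a 90° rotation about Z; an illustrative Python sketch (verify on your own controller before trusting any such conversion):

```python
def kawa_to_xforward(p):
    """Re-express a point from a Y-forward world (Kawasaki-style, +X right)
    in an X-forward world (Staubli/KUKA-style, +Y left). Same physical
    point, axes rotated 90 degrees about the common +Z (up) axis."""
    x, y, z = p
    return (y, -x, z)

# A point 500 mm straight in front of the robot, 100 mm up:
print(kawa_to_xforward((0.0, 500.0, 100.0)))  # -> (500.0, -0.0, 100.0)
```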

    Why they chose to do this and be inconsistent with the rest of the robot market is one of this life's great mysteries.

    On my small teaching robots, the Tool Flange coord sys is also a bit different. -Tool_Y points through the dowel pin. So +Tool_Y points opposite the dowel pin. :thinking_face:

    Kawa makes a nice robot system, but it has its quirks. Perhaps my assumption/assertion that Kawa is essentially a captive company for Toyota (and, to a lesser extent, Ford) is true. So they don't need be market-consistent or friendly.

  • About shifting based on the FRAME coordinate system

    • TygerDawg
    • April 16, 2022 at 5:35 PM

    As a rule, I ***NEVER*** change the BASE of a robot. In my particular world, there are just too many problems that doing this could create. Especially if I change the BASE on an application and leave it for some other poor goombah to discover by accident. Most robot users/programmers do not understand transformations, so why leave these boobytraps out there for others to find?

    It's been many years, so please forgive if my transformation math is not exactly correct.

    If I remember correctly, SHIFT only applies to points in the BASE coordinate system.

    So the problem statement is:

    GIVEN:

    • arbitrary FRAME (call it: frame1) out in robot space taught in the default BASE robot coordinate system
    • an X-Y-Z pattern of points relative to frame1 (call them pt1 = frame1+TRANS(X1, Y1, Z1), pt2 = frame1+TRANS(X2, Y2, Z2), etc)

    FIND: I desire to move the points pt1, pt2, etc. to my dimensional specification within frame1 (sort of like using SHIFT inside frame1)

    SOLUTION:

    • OPTION1: Use TRANS function to calculate transformations within the frame1 coordinate system
      • command: pt1_new = frame1+TRANS(X1+X1_shift, Y1+Y1_shift, etc)
    • OPTION2: inverse the frame1 back to BASE and then use SHIFT within the BASE coordinate system
      • command: pt1_new = SHIFT(-frame1+pt1 BY X1, Y1, Z1)
        • ASSUMPTION: all of the various coordinate systems are aligned suitably. Else it may be necessary to change the order of the X, Y, Z components to get the desired changes

    Like I said, it has been many years and my command might be incorrect. I'd have to find a machine and test this and prove the correct trans math order. But I hope that you are understanding my intention.
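    OPTION1 can be sanity-checked numerically (plain Python, not AS language; the frame values are invented). The point is that offsets added inside the TRANS move the point along the *frame's* axes, which is exactly the "SHIFT inside frame1" behavior wanted:

```python
def frame_point(origin, axes, local):
    """World position of frame1+TRANS(X, Y, Z):
    origin + X*x_axis + Y*y_axis + Z*z_axis."""
    return tuple(o + sum(l * ax[i] for l, ax in zip(local, axes))
                 for i, o in enumerate(origin))

# A frame rotated 90 deg about Z: its X-axis lies along base +Y.
axes = ((0.0, 1.0, 0.0),   # frame X
        (-1.0, 0.0, 0.0),  # frame Y
        (0.0, 0.0, 1.0))   # frame Z
origin = (200.0, 100.0, 0.0)

p1 = frame_point(origin, axes, (10.0, 0.0, 0.0))
p1_shifted = frame_point(origin, axes, (10.0 + 5.0, 0.0, 0.0))  # OPTION1
print(p1, p1_shifted)  # the +5 moved the point along BASE +Y = the frame's X-axis
```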

  • Ggg

    • TygerDawg
    • November 9, 2021 at 3:24 AM

    You may not ever resolve this. Reasons:

    • robots are repeatable, not accurate
    • path accuracy is a tubular tolerance zone around a mathematically perfect path definition and is dynamic based on arm configuration (varies all over the work volume)
    • a robot arm ain't a rigid CNC machine tool
    • servo parameters for individual motors may need tweaking to smooth out those bumps...not recommended
    • some arms are meant for carrying heavy loads point-to-point, not to follow a path

    Servo-tweaking:

    Years ago I was engaged in a project to resolve an out of tolerance condition on a Staubli robot path doing high-precision cutting. The application was analyzed. It showed that the path segment containing the tolerance deviation occurred when the J2 and J3 links went through an inflection point. The arm extended, paused, and then retracted. J2 & J3 opened, paused, then closed. Poof: consistent path tolerance deviation at the inflection point.

    How resolved: some really clever French servo engineers "super-tuned" the servo parameters and sent us new servo parameter settings. The problem was corrected. But I hate solutions like that: implementing a "special" in the field and leaving it for the next poor b@st@ard to find.

    Wrong arm use:

    I was tasked with a dispensing job using a refurb BigYella spot welding beast. Dispensing path tolerance is very forgiving. Except this big arm had to move its J5 & J6 in wild gyrations as the dispensing gun traversed around a sharp corner. Angular accelerations were high for J5 & J6. Result: squiggly dispensing lines that could not be programmed out. Cause: big J5 & J6 gears made for carrying heavy payloads, not doing finesse work.
