Posts by x9tech

    For finding out when the bot crashed... there are 2 options that I have used before:


    1) You can parse the data from the realtime interface with a PC application and get the status bits for the bot.


    2) In your program, run a separate thread that keeps changing a Modbus value (say, count every second from 0 to 255 and then start over). You can read this from your PLC; when the number stops incrementing, the program has stopped.
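

    A minimal sketch of option 2 in URScript, writing to the robot's own Modbus server over loopback. The register address (128) and the signal name are assumptions; use any free general-purpose register:

    # Heartbeat the PLC can poll. Address 128 and the name "heartbeat"
    # are assumptions - any free general-purpose register (128-255) works.
    modbus_add_signal("127.0.0.1", 255, 128, 3, "heartbeat")

    thread heartbeat_thread():
      hb = 0
      while True:
        modbus_set_output_register("heartbeat", hb, False)
        hb = hb + 1
        if hb > 255:
          hb = 0
        end
        sleep(1.0)
      end
    end

    thrd = run heartbeat_thread()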


    For programming from PC... you can install VNC on the bot and remotely operate the pendant from your desk.


    For putting into a box... with some position scripting in URScript, you can take the measurements of the box and compute the relative position movements from a corner of the box (the box origin). That way you just teach 5 points (the box origin with wrist angle) and then make relative movements from there.
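

    As a rough sketch: box_origin would be a taught pose at the box corner, and the slot pitches below are made-up numbers:

    # Hypothetical slot placement relative to a taught box-corner pose.
    # Pitches are example values; all offsets are in meters.
    def slot_pose(col, row, layer):
      dx = col * 0.060     # 60 mm pitch in x (assumed)
      dy = row * 0.060     # 60 mm pitch in y (assumed)
      dz = layer * 0.040   # 40 mm layer height (assumed)
      return pose_trans(box_origin, p[dx, dy, dz, 0, 0, 0])
    end

    movel(slot_pose(2, 1, 0), a=1.2, v=0.25)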

    First, download the UR10 user manual and read through it - reading an Analog input is well covered in there.


    For your "function", that can mean a lot of things, but in the truest sense of the word, you want to create a script, add it to your program, and then call your function with a "Script Code" command. That is also covered in the manual. From there you might have some more specific questions.
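

    For example, a tiny function like this saved in a script file could be called from a Script Code line. The pin number and threshold here are made up:

    # Hypothetical helper: True when the sensor on analog input 0
    # reads above 0.5 V. Pin and threshold are assumptions.
    def sensor_triggered():
      return get_analog_in(0) > 0.5
    end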

    maybe:


    - wire up "operator input needed" light to output
    - wire up "yes" button on panel to input
    - wire up "no" button on panel to input


    At decision time, turn on the light and wait for either the yes or no button to be pressed.
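

    In the program, that wait could look roughly like this. The I/O numbers are assumptions (DO0 = light, DI0 = yes, DI1 = no):

    # Turn on the "operator input needed" light, wait for either button,
    # then report which one was pressed.
    def ask_operator():
      set_digital_out(0, True)
      while not (get_digital_in(0) or get_digital_in(1)):
        sleep(0.05)
      end
      answer = get_digital_in(0)   # True = yes was pressed
      set_digital_out(0, False)
      return answer
    end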


    I'm not aware of a way to accomplish a yes/no decision with a single button, or to use a Digital Input as a response to an on-screen popup.

    Yikes. That almost sounds like a loose connection on the data line between joints or something.


    On the UR5s there's a thick pair of power cables that runs in series to each joint, as well as a really tiny pair that carries data (think RS485). It almost sounds like your controller begins providing power but your data comms are flaky.


    There are quite a few spots to take apart and things to try, but if this is brand new / under warranty, this is your distributor's problem and should be a reasonably quick RMA replacement.

    Of all the weird hardware failures I've seen, this is not one of them. Was this right after a firmware upgrade on the joints or controller?

    Short answer - you can't.


    You can, however, make a predefined .urp that simply references a single script file as its contents, then update that script file remotely and run the program to execute it.


    We have gone as far as writing a "UR Compiler" that packages up a bunch of programs into a single script and controls the execution via modbus registers, for performance.
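

    The resulting script is shaped roughly like this (the register address and job codes here are arbitrary):

    # Hypothetical dispatcher: the PLC/PC writes a job number to a
    # register and the script runs the matching packaged-up program.
    modbus_add_signal("127.0.0.1", 255, 130, 2, "job_select")

    def job_1():
      textmsg("running job 1")   # ...program 1 contents...
    end

    def job_2():
      textmsg("running job 2")   # ...program 2 contents...
    end

    while True:
      job = modbus_get_signal_status("job_select", False)
      if job == 1:
        job_1()
      elif job == 2:
        job_2()
      end
      sleep(0.1)
    end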

    My preferred method of doing this is interpolate_pose().


    Make installation variables for the max analog value + max robot position, and the min analog value + min robot position.


    Then in your robot program, you can read the current analog value, compute the alpha (how far the current value sits between min and max, as a fraction of that range), and use interpolate_pose() to generate the "in-between" waypoint/pose at the water level. At that point you have your "water height" pose in a variable and can move to it however you like.
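

    Roughly, assuming the installation variables are named min_analog/max_analog and min_pose/max_pose, with the sensor on analog input 0:

    # Map the live analog reading onto the taught min/max poses.
    # Variable names and the input pin are assumptions.
    level = get_analog_in(0)
    alpha = (level - min_analog) / (max_analog - min_analog)
    water_pose = interpolate_pose(min_pose, max_pose, alpha)
    movel(water_pose, a=1.2, v=0.25)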

    How the tablet app works is obviously a consideration, but generally, anything you could automate with a camera on a Windows PC, you should be able to automate in a more robust way by interacting with the Windows app's control tree directly.


    Here is a doc that covers how to iterate a control tree. https://msdn.microsoft.com/en-…op/ee671590(v=vs.85).aspx


    And here's a more in-depth writeup: https://www.codeproject.com/Ar…crosoft-Automation-Framew


    A reasonable play here would probably be a C# app running on the tablet, polling some Modbus register(s) on the UR bot for what to do next (there is a great NModbus library already out there for C#). For a "start machine" type workflow, the app would see the Modbus value that represents "press the start button", walk the control tree, find the start button control (by text, button size/position, or whatever), and issue a click event for that button. The robot has a built-in Modbus server with a lot of registers reserved for application-specific use, so you can keep it simple that way and have plenty of room to add registers for status, error codes, etc.
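

    On the robot side, the handshake could be as simple as this. Register addresses 140/141 and the command/status codes are made up:

    # Hypothetical handshake: write a command code the C# app polls for,
    # then wait for its status reply before moving on.
    modbus_add_signal("127.0.0.1", 255, 140, 3, "tablet_cmd")
    modbus_add_signal("127.0.0.1", 255, 141, 2, "tablet_status")

    def press_start_button():
      modbus_set_output_register("tablet_cmd", 1, False)   # 1 = "press start"
      while modbus_get_signal_status("tablet_status", False) != 1:
        sleep(0.1)                                         # 1 = "done"
      end
      modbus_set_output_register("tablet_cmd", 0, False)   # back to idle
    end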


    To take it one step further, if this is like a wireless remote tablet, I'd take a look at what protocol the vendor's app is using to control the machine (run wireshark on the tablet to capture the traffic), and if it turns out that it's a simple socket command or something like "START JOB 3", you might even be able to issue the command directly from the robot using socket scripting and make the whole thing simpler.
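

    If it does turn out to be that simple, the robot-side script is only a few lines. The IP, port, and command string below are placeholders for whatever the capture shows:

    # Hypothetical direct-socket command to the machine controller.
    def start_job(job_no):
      socket_open("192.168.1.50", 5000, "machine")
      socket_send_string("START JOB " + to_str(job_no), "machine")
      socket_send_byte(10, "machine")   # newline terminator, if required
      socket_close("machine")
    end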

    There are a couple of variations on this. First off, if you're working with a Windows interface, it will be a lot more reliable to integrate a software agent on the OS that can control the application (perhaps linking to the robot via Modbus for job selection and job control). Windows has pretty good accessibility APIs for iterating over an application's visual control tree and taking actions.


    If that's absolutely not an option and the robot has to press the buttons, then the next best step might be to integrate a vision camera (Cognex, etc.) pointed at the screen, and train it to recognize the patterns of the buttons you're looking for. Provide the X/Y coords of your target GUI button to the robot, and the robot can navigate to those coordinates and push the screen.
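

    The touch itself could be a relative move from a taught screen-plane pose, along these lines. screen_origin, the millimeter units, and the 10 mm hover are all assumptions:

    # Hypothetical touch move: screen_origin is a taught pose at the
    # screen's top-left corner with z pointing into the glass; the
    # camera is assumed to report button coordinates in millimeters.
    def press_screen(x_mm, y_mm):
      target = pose_trans(screen_origin, p[x_mm * 0.001, y_mm * 0.001, 0, 0, 0, 0])
      hover = pose_trans(target, p[0, 0, -0.01, 0, 0, 0])   # 10 mm off the glass
      movel(hover)
      movel(target, a=0.5, v=0.05)   # touch
      movel(hover)
    end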

    I don't think there are any built-in time tracking functions... you will probably need to:


    1) create a script file that starts a thread that increments a counter by 1 and then sleeps for 1 second, and include it in your program
    2) log the counter value at the end of your cycle and reset it back to 0


    To quickly see the last 20 cycles, you could throw the results into a list of 20 values and view it on the Variables tab as it runs.
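

    The script file might look something like this (names are made up, resolution is a second at best, and the Polyscope-style "global" prefixes make the counter a shared program variable):

    # Sketch of a cycle timer running in a background thread.
    global cycle_seconds = 0

    thread timer_thread():
      while True:
        global cycle_seconds = cycle_seconds + 1
        sleep(1.0)
      end
    end

    thrd = run timer_thread()

    # ...then at the end of each cycle:
    textmsg("cycle time (s): ", cycle_seconds)
    global cycle_seconds = 0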

    The lack of nested subprograms is probably the worst design flaw of the UR software.


    Making a script file for your reusable functions and then calling those functions as a script node in your program is probably going to be your best bet.
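

    For instance, a myfunctions.script file with your routines defined once (all names here are made up; pick_approach/pick_pose would be taught poses, and the gripper is assumed on DO0):

    # Hypothetical reusable routine kept in a shared script file.
    def pick_part():
      movej(pick_approach)
      movel(pick_pose, a=0.5, v=0.1)
      set_digital_out(0, True)   # close gripper
      movel(pick_approach)
    end

    Pull the file in with a Script node set to "File", and after that a one-line Script Code node anywhere in the program can just call pick_part().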

    Most straightforward would likely be transmitting the X/Y coordinates to the UR10's internal modbus server.


    No idea what language you are using, but here's the gist:


    1) Computer takes picture


    2) Computer sends X/Y coordinates via Modbus TCP registers to the robot's IP address; registers 128-255 are available for this use. Keep in mind that you will need to convert your measurements to a Modbus-friendly value (0-65535, whole numbers only).


    So, for example: modbusValue = ((X*10)+32767). The *10 preserves one decimal place, and the +32767 offset keeps negative coordinates inside the 0-65535 range.


    If your X coordinate were -5.2 in the example above then your modbus value would be 32715.


    3) Add a modbus "Register Input" on your UR10 installation > Modbus Client. Set IP address to 127.0.0.1 to point it at the robot's own modbus server. You will see the value update in real-time on that screen whenever the PC sends it.


    4) In your UR program you now have that modbus register value available to you and can convert it back to your unit of measurement with ((X-32767)/10). Then apply it to whatever movement you need (lots of info in the UR scripting manual about pose_add / pose_trans etc if you need to move the robot a variable distance in any direction).
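

    Step 4 in script form might look like this, assuming the register inputs were named MB_X and MB_Y, a matching Y register was added, and the original units were millimeters:

    # Read the scaled values back and undo the PC-side conversion.
    # Signal names, units, and ref_pose are assumptions.
    x_mm = (modbus_get_signal_status("MB_X", False) - 32767) / 10.0
    y_mm = (modbus_get_signal_status("MB_Y", False) - 32767) / 10.0
    target = pose_add(ref_pose, p[x_mm * 0.001, y_mm * 0.001, 0, 0, 0, 0])
    movel(target, a=1.2, v=0.25)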


    Hope that helps you get started.

    If it's a long tool, then the configured length / center of gravity might have something to do with it.


    That said, people always get really technical about calculating the payload. My technique: press freedrive. If it takes off toward the sky, payload is set too heavy. If it takes off toward the ground, payload is too light.


    Once it doesn't move, press freedrive and move the robot up and down with your hand; you'll have the payload value correct when it takes the same amount of force to move it in both directions.

    Yeah, the PolyScope software and any sort of UR robot simulation are one and the same... it's certainly not going to get any lighter from there.


    URSim is actually not a crazy intensive/heavyweight VM (not much more than the normal Ubuntu VM overhead, anyway). What made you feel like it was too much?