Posts by SkyeFire
-
Well, there's lots of different types of encoders. The simplest encoder is the "quadrature" encoder, which produces 2 sine waves at a 90-degree relative phase angle. Each full wave indicates a specific fraction of a rotation, and the lead/lag relationship between the two phases indicates the direction of rotation.
I would guess that a "pulse encoder" is something similar, but perhaps putting out square waves instead of sines. Or, it could be something different, like a pulse-train signal that triggers every 1/X degrees, and a separate signal that indicates direction.
There are, of course, many variations on the theme. So brand-specific decoders or translators may be necessary to convert from one standard to another.
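To illustrate the quadrature idea in software (a sketch of my own, not any vendor's implementation -- the Gray-code ordering and the sampling format are assumptions, and real decoders are normally done in hardware or interrupt routines):

```python
# Quadrature state order for one direction of rotation (Gray code):
# A,B = 00 -> 01 -> 11 -> 10 -> 00 ...
SEQ = [0b00, 0b01, 0b11, 0b10]

def decode(samples):
    """Count net quadrature steps from a list of (A, B) samples.

    Stepping forward through SEQ increments the position; stepping
    backward decrements it, which is how direction is recovered from
    the two phases.
    """
    pos = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        cur = (a << 1) | b
        if cur == prev:
            continue  # no edge on either channel
        pi, ci = SEQ.index(prev), SEQ.index(cur)
        if (pi + 1) % 4 == ci:
            pos += 1          # one step forward
        elif (ci + 1) % 4 == pi:
            pos -= 1          # one step backward
        # else: both bits changed at once -- noise; ignored here
        prev = cur
    return pos
```

Four counted steps correspond to one full electrical cycle of the encoder, which is why quadrature gives 4x the resolution of a single pulse train.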
-
Based on my recent experience, 2sec between closing and re-opening a channel. And consecutive EKI_SENDs need to be separated by 0.5sec, minimum.
-
KRL can be programmed entirely on the pendant, but for complex programming, a PC editor is far superior. You can use any simple ASCII text editor, like Notepad++.
One thing to watch out for is that programs created on the pendant add a great deal of "cruft" that is not actually used by the robot compiler, but is used by the teach pendant, to enable menu-driven programming. So when you create a program with the pendant, then copy it out to edit offline, what you see in the text editor can look much different from the appearance on the pendant. There is a decent KRL-specific editor for free from orangeapps.de, called OrangeEdit, which has different display modes that can help with this.
-
I do not believe that the batteries are used to power the encoders or resolvers. Rather, the batteries are used to retain memory.
For example... well, it's been a while since I worked on an ABB, but if I recall correctly how the IRC5 works, each servo motor has an absolute encoder mounted directly to the drive shaft. But even a few degrees of arc motion on an axis requires multiple motor rotations, due to the gear ratios. So there is a memory chip (SRAM or something similar) in the robot base that retains a count of how many times the encoder has made a full rotation, and in which direction, since a zero-reference was performed. Normally, this memory (and the battery charge) is maintained by power from the controller.
However, when the controller is powered down, or the cables are disconnected, the batteries prevent the memory from being erased. This way, the robot does not lose its zero reference. But if the batteries are faulty (usually from old age), the robot will lose its zero reference every time main power is lost. This is also why it is necessary to replace these batteries with main power active: removing them without main power applied will cause the zero reference to be lost.
KUKA robots use a slightly different method. The memory chip in the base of the robot arm is non-volatile EEPROM, so no batteries are required in the robot base. However, batteries are required inside the control cabinet to guarantee that, when main power is lost for any reason, the central computer and communications can run from the battery power long enough to perform a controlled shutdown of the operating system, and complete a memory update of the EEPROM chip.
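As a rough illustration of why that stored turn count matters, the axis position is effectively reconstructed from two pieces of data (a simplified sketch of my own -- the function name, the simple linear model, and the parameters are assumptions, not any vendor's actual firmware):

```python
def axis_angle_deg(turn_count, encoder_deg, gear_ratio):
    """Reconstruct an axis angle from the stored full-turn counter
    plus the encoder's within-revolution reading.

    turn_count  -- battery/EEPROM-backed count of full motor revolutions
                   since the zero reference was set
    encoder_deg -- absolute encoder reading within one revolution (0-360)
    gear_ratio  -- motor revolutions per axis revolution
    """
    motor_deg = turn_count * 360.0 + encoder_deg
    return motor_deg / gear_ratio
```

Lose the turn counter (dead batteries, or power removed at the wrong moment) and `encoder_deg` alone is ambiguous: the same reading repeats every motor revolution, so the zero reference is gone.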
-
I believe there is a dedicated signal(s) in ProfiSafe (and perhaps EIP?) that can allow an external PLC to command the controller to shut down remotely. I'm not aware of a means to do this from within a KRL program, however.
-
What does your EKI setup look like? How often are you TX/RXing data, and how often do you open/close the ports? Are you sending raw ASCII, Binary, or XML-formatted data?
A coworker eventually worked out (through loooong trial and error) that, for unknown reasons, some communications transactions, even when they appear to work properly, cause the Error or Warning counter for EKI to increment. And it appears that, once this counter hits its maximum value, rather than rolling over, it crashes KSS entirely with a Sys14 error.
To see if you have the same thing going on, open up the Diagnostic Monitor and scroll down the list until you find the EKI module and select it. Then monitor the "Error" and "Warning" entries as you run your process to see if they increment. If so, then you will eventually top out the error counter, and cause the Sys14 fault.
The fix we eventually found was to set both Messages and Display to "warning", in the EKI channel config file.
One thing that I'd like to dig into someday, but will probably never have time: this was only a problem on channels that used EKI to transfer raw ASCII strings between the KRC4 and a Siemens PLC. Other channels, which exchanged XML-formatted data with a "normal" PC application, did not (well, not after we ironed out some subtle formatting issues). I'm not sure if the incrementing error was due to the PLC, or simply to using "raw ASCII" mode.
Other things to look for: you should have 2sec, minimum, between closing an EKI channel and re-opening it. Also, consecutive EKI_SENDs on a given channel should have at least 0.5sec delay between them. Violating either of these rules will cause the error counter to slowly, silently, stack up until the Sys14 fault eventually occurs.
A cold boot resets the counters. However, nothing else seems to.
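In KRL terms, the timing rules look roughly like this (a sketch only -- the channel name "MyChannel" and the message strings are placeholders, and checking the returned EKI_STATUS for errors is omitted for brevity):

```
DECL EKI_STATUS ret

ret = EKI_Init("MyChannel")
ret = EKI_Open("MyChannel")

ret = EKI_Send("MyChannel", "first message")
WAIT SEC 0.5               ; >= 0.5 sec between consecutive EKI_Sends
ret = EKI_Send("MyChannel", "second message")

ret = EKI_Close("MyChannel")
WAIT SEC 2.0               ; >= 2 sec before re-opening the channel
ret = EKI_Open("MyChannel")
```

Skipping either WAIT is what silently stacks up the error counter described above.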
-
Exact implementations vary, but it's mostly about reverse-engineering a sphere.
When you mount a tool to a robot, the robot knows nothing about that tool's dimensions. However, what the robot does know, by design, is the location of its "zero tool" -- that is usually the center of the mounting flange where end effector tooling is mounted to the robot. Because this point is part of the robot, the robot always knows where this "zero tool" is located in space, relative to the robot base, from the forward kinematics of the various axis angles.
So, when you perform a tool calibration by (on most robots) touching the tool tip to the same point in space from four different angles, the robot knows nothing about the tool, or the point in space you're touching. But it does know the location of the "zero tool" at each of those four measurements. This then becomes a problem for determining the center of a sphere from points on its surface:
http://math.stackexchange.com/questions/8947…-given-4-points
http://stackoverflow.com/questions/1360…ains-4-points-c
https://www.safaribooksonline.com/library/view/e…ter011-02.xhtml
The center of the sphere is the fixed point in space being used as the reference for the tool tip, relative to the robot base. At that point, it becomes a matter of finding the XYZ transform that, using the "zero tool" as the point of origin, will make the transition from the spatial location of the "zero tool" at each of those measurement locations to the reference point.
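A minimal sketch of the sphere-center step (pure Python, my own illustration, not KUKA's actual algorithm): subtracting the sphere equation at the first point from the equations at the other three cancels the unknown radius and leaves a 3x3 linear system in the center coordinates, solvable by plain Gaussian elimination:

```python
def sphere_center(p1, p2, p3, p4):
    """Center of the sphere through four non-coplanar 3D points.

    Subtracting |p - c|^2 = r^2 at p1 from the same equation at
    p2..p4 cancels r^2 and leaves three linear equations in c:
        2*(pi - p1) . c = |pi|^2 - |p1|^2
    """
    pts = [p1, p2, p3, p4]
    rows = []
    for p in pts[1:]:
        coeffs = [2.0 * (p[k] - p1[k]) for k in range(3)]
        rhs = sum(v * v for v in p) - sum(v * v for v in p1)
        rows.append(coeffs + [rhs])

    # Gaussian elimination with partial pivoting on the 3x3 system
    for i in range(3):
        pivot = max(range(i, 3), key=lambda r: abs(rows[r][i]))
        rows[i], rows[pivot] = rows[pivot], rows[i]
        for r in range(i + 1, 3):
            f = rows[r][i] / rows[i][i]
            for col in range(i, 4):
                rows[r][col] -= f * rows[i][col]

    # Back-substitution
    c = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        c[i] = (rows[i][3]
                - sum(rows[i][j] * c[j] for j in range(i + 1, 3))) / rows[i][i]
    return tuple(c)
```

With perfect measurements any four non-coplanar flange positions give the exact center; the robot's actual routine repeats the fit across the measurements and averages, precisely because the measurements are never perfect.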
Technically, in a perfect world, any one of these 4 points would be enough to determine the tool dimensions relative to the "zero tool," once the reference point (sphere center) has been determined. However, since the robot is an imperfect device, the algorithm performs this transform/fit calculation for all 4 measurements and attempts to average their results. If the error range between the four measurements is too great, the robot decides that the measurements were insufficiently precise and throws an error, usually telling the operator to start over and try again.
-
It really depends on your sub-specialization. Robot programming and PLC/controls programming has some overlap (most people who do one can do at least a little of the other), but the mechanical design side (tooling, fixturing, etc) is a completely different thing. Process engineering (sequence, line flow) is an entire art of its own, and line layout design (how to fit 10 acres of equipment into a 5 acre plant floor) is another. Then there's various electrical, pneumatic/hydraulic, welding/cutting, etc etc etc....
My opinion here is going to be biased by my own experience, but I think that the strongest "bottom rung" starting position is in Robot and/or controls programming, if that's your thing. Someone with a strong mechanical-design bent, for example, might not want to try that route.
At the end of the day, automation will always depend on programming, and being on the "pointy end" of programming will expose you (and force you to learn) related items like machine safety, personnel safety, process flow, material handling, welding/cutting, adhesive application, communications debugging, systems integration, some degree of electrical and mechanical design/debug, and so on. If you work in a part of the field that does a broad variety of applications, you'll pick up a broad range of experience that will be invaluable whenever you encounter a "we've never tried doing this before" situation.
For myself, I started as an Electrical Engineer with a sideline in computer programming, and turned out to have a feel for robots and control systems. A good automation degree track, or industrial training course, will concentrate less on specific sub-fields and more on how the sub-specialities interact. When I fell (or was I pushed?) into robotics, my formal EE education didn't help me a lot, but all the hands-on skills I learned during the lab portions of earning my degree (I went to a school that was much more hands-on than most) absolutely saved my bacon many times.
Software is going to be the core of almost any automation, and probably even more so as the state of the art advances. But "ivory tower" programmers have often been the bane of my existence -- people whose entire skill set and experience is in design and simulation software, who don't understand the differences between the map and the territory. IMO, being an effective automation engineer really requires time "in the trenches," getting your hands dirty and experiencing what happens when the project plan meets the production reality (hint: BOOM).
As the state of the art advances, more advanced programming skills will probably become increasingly valuable. But really understanding how the software interacts with the physical world of actuators and sensors is even more critical. Case in point: on one project, the design team approved and purchased an entire set of servo actuators for the robots' end effectors... and had no idea that those actuators had no way to be interfaced to the robots we were using, until all the hardware was delivered and I got assigned to make it all work, somehow.
Bottom line: it's all about the interactions. A good automation engineer knows enough about electrical, mechanical, pneumatic, hydraulic, process, safety, etc, to avoid making really stupid (and expensive) mistakes (which usually means knowing when and where to find the right people to ask detailed questions of). But the single most critical skill for someone in automation is to grasp how the different parts of the overall system interact with each other, predict which interactions are benign and which may be fatal, and envision what can go wrong before it does -- and come up with ways to ensure that, at worst, the machine fails safely for all foreseen (and unforeseen) occurrences.
Automation will always have its mechanical, electrical, etc sub-specialties. Pick the one that fits you best, and excel at it, but also be the best at understanding how your specialty interacts with all the others, and keep learning and expanding on that. Play it right, and you can start from almost any sub-specialty and end up as the person everyone goes to for answers. Automation will probably always be one of those fields that rewards the "jack of all trades" people as much as the hardcore specialists.
-
There's also the question of what kind of programming you are trying to do. I've never seen anyone use RobotMaster unless they were trying to do complex CNC machining with an articulated industrial robot, which is an entire art in and of itself.
If you're not doing something like that, it's probably more cost-effective to simply hand-teach your robot.
-
Multiple options exist. Absolute encoders, relative encoders, resolvers... all variations on the same theme. Yes, resolvers usually need some sort of AC voltage at the coils, but often, from an "external" viewpoint, a resolver only needs a reliable power supply and handles the precise low-level coil AC internally.
-
Not all robots use this. Different brands do it differently.
For those robots that [b]do[/b] require batteries in the base of the arm, this is usually because there is some volatile memory storage on a circuit board in the arm. Usually, this is in place to preserve the "zero" setting of each resolver/encoder, which is a unique property per axis per robot. For robots that store these critical offsets in volatile memory (like SRAM), a small power supply is necessary so that the memory is not lost whenever the control cabinet is shut down, or if the arm is disconnected from the cabinet. It is also valuable if, for any reason, one needs to swap a robot between different cabinets.
Some robot brands use non-volatile memory (EEPROM) to fulfill this function. Most KUKA robots work this way. One of the tradeoffs is that EEPROM can have a shorter lifespan, in read/write cycles, than battery-backed SRAM.
-
By "desktop", you mean a monitor?
If you have connected a DVI monitor directly to the motherboard, what do you see when power is applied to the KRC4? If nothing is seen on the monitor, not even a BIOS boot display, this implies there is a power or hardware failure.
-
200 is simply a status message.
1216 and 1232 are the ones that point at the problem. Since they're both "ACKN" messages, that means that the problem that caused them is no longer active. Which in turn means that the root cause is intermittent.
Since the message is limited to Axis 3, the most likely cause is one of the cables going to the Axis 3 motor. You'll want to check both cables for tightness, possible hidden breaks, insulation issues, etc. Also check the pins in both motor sockets -- if someone got careless, those pins can be broken loose from their moorings in the connector and start making intermittent contact.
-
What is the history of this robot? Is it brand new, or has it been working until recently? What changes have been made to it recently?
Have you tried connecting a monitor to the DVI port directly, and observed the boot process?
Have you tried pinging the robot's IP address across the network?
-
Mmm... ought to be, technically. You're still going to have to re-install KSS though, regardless. A chkdsk is only going to flag bad sectors on the hard drive so they aren't used.
-
...okay. "Ethernet connection" doesn't really mean anything. There's lots of different protocols that travel over Ethernet cables, and a lot of machines that "abuse" Ethernet RJ-45 connectors for uses (like limit switches, power, etc) that have nothing to do with Ethernet.
The only way to even begin answering this question would be to nail down a list of the communication protocols the Okuma supports, and an equivalent list for the KRC, and (hopefully) find one that matches on both lists. Then it'll be a matter of obtaining, installing, and configuring that communications option package.
-
Hm... I had heard that KUKA already has an option package that does something like this, for tightly-synchronized entertainment systems. I think the Harry Potter ride at, um, Universal Orlando? Maybe? I've never seen it myself.
That's probably pretty specialized, though.
-
Depends. What communication options does the Okuma support?
-
Probably have to wait until you have a working KSS install, after which you should be able to do a normal Archive Restore and get those settings back. There are ways to modify those files on the hard drive via Windows, but there's not much point, since that won't do anything to help your current boot problem.