Posts by Spl

• Status and Turn fault

If you specify the status and turn values, of course they will be evaluated, and because 0 is an invalid value for both status and turn, you get the error. If you don't want status and turn to be evaluated, specify only the x, y, z, a, b, c values.
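A minimal KRL sketch of the difference (the coordinate values here are hypothetical):

```
; S and T are specified, so they are evaluated along with the pose
PTP {X 500, Y 0, Z 800, A 0, B 90, C 0, S 2, T 35}

; S and T are omitted, so only the cartesian values are evaluated
; and the current status/turn are kept
PTP {X 500, Y 0, Z 800, A 0, B 90, C 0}
```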

• Can anyone advise it wrong with tool calibration or with robot accuracy?

Ok, so you have changed the laser's orientation so that the robot's wrist is orientated the same way in both steps. Now repeat your previous tests to see if the grinding step still shows the same localization errors, and report back to us.

• Can anyone advise it wrong with tool calibration or with robot accuracy?

Well, can you physically rotate the laser 90 degrees so that the robot's wrist is orientated the same way in both steps?

• Can anyone advise it wrong with tool calibration or with robot accuracy?

Is it possible to orientate the laser so that the angles of its tool correction are very close to the angles of the spindle's tool correction? Big orientation differences between the two tools will cause localization errors.

• Base change issue

Try the linear move without the c_dis approximation. Also, instead of $base = ... you could use bas(#base,1) to set the base.
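A minimal KRL sketch of both suggestions (p_target and the base number 1 are hypothetical):

```
LIN p_target  ; exact positioning, no C_DIS approximation

; instead of writing to $base directly ...
; $base = base_data[1]
; ... let the BAS routine activate it:
bas(#base, 1)  ; sets $base from base_data[1]
```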

• Real Time Position Monitoring on PC while a Motion Command is running

Just put the code that sends the current pose to the server into sps.sub
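If you're using EthernetKRL, a rough sketch of what that could look like (the channel name "PoseServer" and the XML paths are hypothetical; the exact EKI calls depend on your channel configuration):

```
; declaration section of sps.sub:
DECL EKI_STATUS ret

; inside the USER PLC loop:
ret = EKI_SetReal("PoseServer", "Pose/@X", $POS_ACT.X)
; ... same for Y, Z, A, B, C ...
ret = EKI_Send("PoseServer", "Pose")
```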

• Keep Flange "looking" at object

Another way would be to define the face position as a full pose (position and orientation), and then define a long tool that can reach that pose. You could also dynamically adjust the tool's z value according to how far the face is from the robot's root.

• Keep Flange "looking" at object

1) Calculate rotation matrix from the current abc angles
2) Calculate rotation matrix that aligns flange z-axis with the vector from flange to face
3) Multiply the flange rotation matrix with the second one
4) Calculate abc angles from the resulting rotation matrix

http://math.stackexchange.com/…ector-a-to-vector-b-in-3d
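For reference, the closed form from that link: given unit vectors $a$ (current flange z-axis) and $b$ (direction from flange to face), with $v = a \times b$ and $c = a \cdot b$, the aligning rotation in step 2 (valid for $c \neq -1$) is

```
R \;=\; I \;+\; [v]_\times \;+\; [v]_\times^{2}\,\frac{1}{1+c},
\qquad
[v]_\times \;=\;
\begin{pmatrix}
0 & -v_3 & v_2 \\
v_3 & 0 & -v_1 \\
-v_2 & v_1 & 0
\end{pmatrix}
```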

• Array modification in .src

Define an integer variable that you increase when you fill the array, so that you know how many valid poses there are in it.
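A minimal KRL sketch (the array name, size, and stored pose are hypothetical):

```
DECL E6POS poses[100]
DECL INT pose_count
DECL INT i

pose_count = 0

; each time you store a pose, increase the counter:
pose_count = pose_count + 1
poses[pose_count] = $POS_ACT

; later, only indices 1 .. pose_count hold valid poses:
FOR i = 1 TO pose_count
  PTP poses[i]
ENDFOR
```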

• Ethernet xml connection

Well, of course it doesn't work if you configure both the PC and the controller as servers. If you start the XmlServer on the controller, you need a client on the PC that connects to 172.31.1.147:54600. The EthernetKRL_Server isn't a client.

• KUKA KRC4 and Beckhoff PLC Communication

Connect the CX to the X67.1 and see if you can scan the bridge into the CX's project. If you can, then configure the input/output messages that are sent through the bridge, on both the CX's side and the KUKA's side. If I remember correctly, the message sizes should be multiples of a byte. Also note that the input message on the CX's side is the output message on the KUKA's side, and vice versa.

• KUKA KRC4 and Beckhoff PLC Communication

Ok. So we have the controller connected to the X1 port of the EL6692, but the X2 is empty because the bridge is installed on the CX PC, so it's using the CX's internal bus.

• KUKA KRC4 and Beckhoff PLC Communication

Can't remember if we had the KRC4 connected to the X1 or X2, but will check that tomorrow.

• KUKA KRC4 and Beckhoff PLC Communication

The KRC4 is unable to communicate directly with the CX embedded PCs, because there's no way to configure them on KUKA's EtherCAT bus. That's why you need the EL6692 between the CX and the KUKA, which can be configured from both sides.

• KUKA KRC4 and Beckhoff PLC Communication

EK1100 is not necessary. You can connect the EL6692 directly to the X44 interface.

• Help upgrading KR350 for Virtual Reality Ride!

Yes, it's real-time control of the robot in pretty much any way you want. And I really don't see any other way you could properly sync the virtual environment and the robot motions, but maybe someone else does.

• Help upgrading KR350 for Virtual Reality Ride!

If you want to control the motion from the remote computer, I would suggest considering RSI.

• Kinematic Calibration using vision

I'm not sure if it's possible to kinematically calibrate the robot with cameras, but you could use them to map the kinematic inaccuracies in the robot's cartesian space. For example, program the robot to move along a square or circular path, track the TCP's path with the cameras, and calculate the inaccuracies from the results.