Here's an interesting question.
We're currently commissioning two ABB robots that each use a 7th axis (a track).
The cell uses a vision system to determine the correct position of the work object as it arrives.
Now, the problem is such:
We use four cameras to determine the displacement and rotation of the work object. The vision guy gave us the zero-position coordinates of his camera system relative to a cell zero position that was measured previously. Each work object has its own origin for the measurement, and we will be receiving X, Y, Z and RX, RY, RZ displacement offsets.
Using laser measurement, we've also determined the zero position of the work object for the two robots and created a WOBJ for each robot.
Now, where we're a bit stuck is this:
How do we define WOBJs for each work object origin now that we have a cell zero position on each robot? The robots face each other, so their base coordinate systems are rotated 180 degrees relative to one another.
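The way I've been trying to think about it is as a chain of frame compositions: each robot knows the cell zero in its own base frame (from the laser measurement), and the vision offsets are expressed relative to that cell zero, so the WOBJ for each robot should just be (cell zero in base) composed with (vision offset). Here's a rough numpy sketch of that idea — all the numeric values are placeholders, and the RX/RY/RZ angle convention (order, intrinsic vs. extrinsic) is an assumption that would need to be confirmed with the vision guy:

```python
import numpy as np

def rot_xyz(rx, ry, rz):
    """Rotation from fixed-axis X, then Y, then Z angles in degrees.
    ASSUMPTION: the vision system's Euler convention may differ - confirm it."""
    rx, ry, rz = np.radians([rx, ry, rz])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx),  np.cos(rx)]])
    Ry = np.array([[ np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz),  np.cos(rz), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def frame(xyz, rxyz):
    """4x4 homogeneous transform from a position (mm) and Euler angles (deg)."""
    T = np.eye(4)
    T[:3, :3] = rot_xyz(*rxyz)
    T[:3, 3] = xyz
    return T

# Cell-zero frame expressed in each robot's base frame (from the laser
# measurement). Hypothetical values; the second robot's frame carries the
# 180-degree rotation about Z because the robots face each other:
T_base_cell_r1 = frame([2000.0,  500.0, 0.0], [0.0, 0.0,   0.0])
T_base_cell_r2 = frame([2000.0, -500.0, 0.0], [0.0, 0.0, 180.0])

# Vision offset of this work object's origin, expressed in the cell-zero
# frame (the X, Y, Z / RX, RY, RZ values the cameras will deliver):
T_cell_wobj = frame([12.5, -3.1, 0.8], [0.0, 0.0, 1.2])

# Work-object frame in each robot's base frame - this is the WOBJ pose.
# The 180-degree rotation is handled automatically by the composition:
T_base_wobj_r1 = T_base_cell_r1 @ T_cell_wobj
T_base_wobj_r2 = T_base_cell_r2 @ T_cell_wobj
```

The nice side effect is that the same vision offset feeds both robots unchanged; only the measured base-to-cell-zero frame differs per robot. In RAPID this composition would be the job of something like PoseMult, but I'd want to verify the data flow on the bench first.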
I don't know how clear that explanation is, but I'll only have access to the vision guy next week to pick his brains. I'd rather get some trajectories programmed this weekend so I won't have to go over them again later when we're doing vision tests.