Hi there!
I work with a KUKA KR 16 R2010 and I am having an issue with setting a base.
My task is to devise an algorithm that processes two sets of input points – one with the points expressed in the coordinate system of the robot, and one with the same points expressed in the global coordinate system – and outputs the transformation between the two coordinate systems. This transformation is defined by a set of XYZ-ABC values and is used to ‘teach’ the robot a base that enables it to move to points expressed in the global coordinate system. I managed to do that, but the problem is that the distance between the points the robot is instructed to go to and the points it actually reaches (the average Euclidean distance, which I will refer to as the ‘error’) is too large. My process consists of the following steps:
- I set the global coordinate system in a total station
- I move the robot to some arbitrary points and store their coordinates in the robot’s coordinate system
- I measure the position of each point with the total station and store its coordinates in the global coordinate system
- using Ordinary Least Squares regression, I obtain a translation vector and a rotation matrix
- I compute the angles of rotation A, B, C from the rotation matrix R with the following algorithm (KUKA uses the ZYX Euler convention, i.e. R = Rz(A) * Ry(B) * Rx(C)):

sb = R(3, 1);          % equals -sin(B)
cb = sqrt(1 - sb^2);   % equals cos(B), assuming B lies in (-90 deg, 90 deg)
ca = R(1, 1);          % equals cos(A)*cos(B)
sa = -R(2, 1);         % equals -sin(A)*cos(B)
cc = R(3, 3);          % equals cos(B)*cos(C)
sc = -R(3, 2);         % equals -cos(B)*sin(C)
% the sign flips above are compensated by the factor -180/pi below,
% so A, B, C come out in degrees with the expected signs
A = atan2(sa, ca) * -180 / pi;
B = atan2(sb, cb) * -180 / pi;
C = atan2(sc, cc) * -180 / pi;

- After I obtain the angles A, B and C, I use them to compute a quaternion by following these steps (first image from attachment, or https://doc.rc-cube.com/v21.07/en/pose_format_kuka.html )
- I use the resulting quaternion to compute a new rotation matrix using this formula (second image from attachment, or https://doc.rc-visard.com/v21.07/en/pose…rotation-matrix)
- To test the transformation, I apply it to the points expressed in the global coordinate system and compare the result (now expressed in the robot's coordinate system) against the untransformed points that were originally recorded in the robot's coordinate system (stored when the robot was moved, not measured with the total station).
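To sanity-check the angle and quaternion computations above, here is a small round-trip sketch in Python (my actual script is in Matlab; the function names are mine, and the formulas are the standard ZYX-Euler and quaternion ones matching the linked KUKA pose convention):

```python
import math

def abc_to_quat(A, B, C):
    """KUKA A-B-C angles (degrees, ZYX Euler) -> unit quaternion (w, x, y, z)."""
    a, b, c = (math.radians(v) / 2.0 for v in (A, B, C))
    ca, sa = math.cos(a), math.sin(a)
    cb, sb = math.cos(b), math.sin(b)
    cc, sc = math.cos(c), math.sin(c)
    return (ca * cb * cc + sa * sb * sc,
            ca * cb * sc - sa * sb * cc,
            ca * sb * cc + sa * cb * sc,
            sa * cb * cc - ca * sb * sc)

def quat_to_matrix(q):
    """Unit quaternion (w, x, y, z) -> 3x3 rotation matrix (nested lists)."""
    w, x, y, z = q
    return [[1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
            [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
            [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)]]

def matrix_to_abc(R):
    """Same extraction as my Matlab snippet (1-based R(i,j) -> 0-based R[i-1][j-1])."""
    sb = R[2][0]
    cb = math.sqrt(1 - sb ** 2)
    ca, sa = R[0][0], -R[1][0]
    cc, sc = R[2][2], -R[2][1]
    A = math.atan2(sa, ca) * -180 / math.pi
    B = math.atan2(sb, cb) * -180 / math.pi
    C = math.atan2(sc, cc) * -180 / math.pi
    return A, B, C

# Round trip: the extracted angles should match the inputs (for B in (-90, 90) deg)
A, B, C = matrix_to_abc(quat_to_matrix(abc_to_quat(30.0, -20.0, 55.0)))
```

This is also how I check the assumption that going ABC -> quaternion -> rotation matrix is self-consistent.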
If the next steps show that the transformation is accurate enough, these A, B and C values are passed to the robot (manually, via the HMI), together with the translation found in the OLS step, to teach it the base that enables it to move to points expressed in the global coordinate system.
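To make the OLS step concrete, here is a sketch in Python/NumPy (my actual script is in Matlab; the function and variable names here are just illustrative):

```python
import numpy as np

def fit_base(robot_pts, global_pts):
    """Least-squares fit of [R | t] mapping global points to robot points.

    robot_pts, global_pts: (N, 3) arrays of corresponding points, N >= 4.
    Sketch only: plain OLS fits a general 3x4 affine matrix.
    """
    n = len(global_pts)
    # Homogeneous coordinates: each row is [x, y, z, 1]
    A = np.hstack([global_pts, np.ones((n, 1))])
    # Solve A @ M ~= robot_pts for the (4, 3) matrix M in the least-squares sense
    M, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)
    M = M.T                       # shape (3, 4)
    return M[:, :3], M[:, 3]      # rotation matrix R, translation t

# Synthetic check: recover a known transformation from exact correspondences
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])   # 90 deg about Z
t_true = np.array([100.0, -50.0, 25.0])
G = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
              [0, 0, 1], [1, 1, 0], [0, 1, 1]], dtype=float)
P = G @ R_true.T + t_true               # the same points in the robot frame
R_fit, t_fit = fit_base(P, G)
```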
The error is calculated as the average Euclidean distance between corresponding points (third image from attachment). If the transformation is good, they should be very close.
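In Python-style pseudocode of what my Matlab check does (illustrative names):

```python
import math

def apply_transform(R, t, pts):
    """Apply p' = R*p + t to a list of 3D points (R: 3x3 nested lists, t: length 3)."""
    return [[sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
            for p in pts]

def mean_error(pts_a, pts_b):
    """Average Euclidean distance between corresponding points."""
    return sum(math.dist(a, b) for a, b in zip(pts_a, pts_b)) / len(pts_a)

# With an exact transformation the error is (numerically) zero
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [10.0, 20.0, 30.0]
global_pts = [[0.0, 0.0, 0.0], [1.0, 2.0, 3.0]]
robot_pts = apply_transform(R, t, global_pts)
err = mean_error(apply_transform(R, t, global_pts), robot_pts)
```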
Right now, the average error that I get is 4.85 mm, and I would like to have an error of 2 mm or less.
Notes:
- From what I’ve seen, what I call ‘robot coordinate system’ is usually referred to as ‘world coordinate system’
- I mastered the robot, which made the error decrease from 5.67 mm to 4.85 mm
- I use 6 points; measuring more did not help, and we are looking for a solution that would work with only 4 points
- The total station measures in a left-handed coordinate system; I compensate for this by multiplying the y-coordinates with -1 before computing the transformation
- I assume that once the robot is given a transformation XYZ-ABC, it computes the quaternion and the rotation matrix the same way I do
- One thing I forgot to mention is that I don't work in KRL. I use a Matlab script, which I will later use to create a TcCOM object, and that is where the transformation will take place.
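Regarding the left-handed total station: the y-flip I apply can be sketched as below (illustrative; negating y corresponds to multiplying by diag(1, -1, 1), a reflection with determinant -1, which is what converts the left-handed measurement frame to a right-handed one before fitting):

```python
def flip_y(pts):
    """Negate the y-coordinate of each (x, y, z) point to switch handedness."""
    return [(x, -y, z) for (x, y, z) in pts]

# Determinant of diag(1, -1, 1): -1, i.e. a reflection rather than a rotation
det = 1 * (-1) * 1
flipped = flip_y([(1.0, 2.0, 3.0)])
```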
If you have any suggestions or know a resource that I could use to solve my issue, I would greatly appreciate it.
Thank you!