 # Math behind the 3-point method for TCP

• Does anyone know how the math behind KUKA's (and other robot vendors') 3-point method works? I found a paper that describes the method, but the referenced formulas seem incomplete, or at least I can't find a working solution using the linear-equation solvers in SymPy and Mathematica.

In theory, you only need 2 points to solve for the XYZ of the TCP if you use the rotations of the taught poses, but my results are bogus.

• determining TCP requires four points. the points lie on a sphere, and the center of the sphere is the TCP.

this was discussed several times, recently here:

RE: Calculating TCP

• Yes, I know. The 4-point (aka center-of-sphere) method is well known. However, some manufacturers offer a 3-point method to determine the tool offset. I was asking if someone knows the math behind that.

Here is the solution for the 4-point method in Python, if anyone cares:

Code
```python
def calc_sphere_center(wp1, wp2, wp3, wp4):
    # Closed-form center (u, v, w) of the sphere through four waypoints,
    # derived by eliminating r^2 from the four sphere equations.
    x1, y1, z1 = wp1.x, wp1.y, wp1.z
    x2, y2, z2 = wp2.x, wp2.y, wp2.z
    x3, y3, z3 = wp3.x, wp3.y, wp3.z
    x4, y4, z4 = wp4.x, wp4.y, wp4.z
    u = ((1/2)*(-((-y1 + y4)*((-x1 + x4)*(-z2 + z4) - (-x2 + x4)*(-z1 + z4)) - (-z1 + z4)*((-x1 + x4)*(-y2 + y4) - (-x2 + x4)*(-y1 + y4)))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(x3**2 - x4**2 + y3**2 - y4**2 + z3**2 - z4**2) - (x3 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(x2**2 - x4**2 + y2**2 - y4**2 + z2**2 - z4**2) - (x2 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2))) + ((-y1 + y4)*((-x1 + x4)*(-x2**2 + x4**2 - y2**2 + y4**2 - z2**2 + z4**2) - (-x2 + x4)*(-x1**2 + x4**2 - y1**2 + y4**2 - z1**2 + z4**2)) - ((-x1 + x4)*(-y2 + y4) - (-x2 + x4)*(-y1 + y4))*(-x1**2 + x4**2 - y1**2 + y4**2 - z1**2 + z4**2))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(z3 - z4) - (x3 - x4)*(z1 - z4)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4))))/((x1 - x4)*((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(z3 - z4) - (x3 - x4)*(z1 - z4)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4)))))
    v = ((1/2)*(-((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(x3**2 - x4**2 + y3**2 - y4**2 + z3**2 - z4**2) - (x3 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(x2**2 - x4**2 + y2**2 - y4**2 + z2**2 - z4**2) - (x2 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2))) + ((x1 - x4)*(x2**2 - x4**2 + y2**2 - y4**2 + z2**2 - z4**2) - (x2 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(z3 - z4) - (x3 - x4)*(z1 - z4)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4))))/(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(z3 - z4) - (x3 - x4)*(z1 - z4)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4)))))
    w = ((1/2)*(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(x3**2 - x4**2 + y3**2 - y4**2 + z3**2 - z4**2) - (x3 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(x2**2 - x4**2 + y2**2 - y4**2 + z2**2 - z4**2) - (x2 - x4)*(x1**2 - x4**2 + y1**2 - y4**2 + z1**2 - z4**2)))/(((x1 - x4)*(y2 - y4) - (x2 - x4)*(y1 - y4))*((x1 - x4)*(z3 - z4) - (x3 - x4)*(z1 - z4)) - ((x1 - x4)*(y3 - y4) - (x3 - x4)*(y1 - y4))*((x1 - x4)*(z2 - z4) - (x2 - x4)*(z1 - z4))))

    return u, v, w
```
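For comparison, the same center can be computed much more compactly by solving the linearized sphere equations numerically. This is my own sketch using NumPy (the `sphere_center` name and plain-array interface are mine, not from the code above):

```python
import numpy as np

def sphere_center(p1, p2, p3, p4):
    """Center of the sphere through four non-coplanar points (length-3 arrays)."""
    p1, p2, p3, p4 = (np.asarray(p, dtype=float) for p in (p1, p2, p3, p4))
    # Subtracting |p - c|^2 = r^2 pairwise eliminates r^2 and leaves the
    # linear system 2*(p_i - p4) . c = |p_i|^2 - |p4|^2 for i = 1..3.
    A = 2.0 * np.vstack([p1 - p4, p2 - p4, p3 - p4])
    b = np.array([p1 @ p1 - p4 @ p4,
                  p2 @ p2 - p4 @ p4,
                  p3 @ p3 - p4 @ p4])
    return np.linalg.solve(A, b)
```

If the four points are (nearly) coplanar, `np.linalg.solve` raises `LinAlgError`, which matches the geometric fact that coplanar points do not pin down a sphere.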

Note that only the XYZ coordinates of the waypoints are used. However, since we also have the orientation, we should be able to use this information to get the sphere center with fewer waypoints. My math approaches so far have not led to a working result. Maybe someone here knows of a paper or technical description of the math behind the process.

• I think you're right. Even though two FRAMEs don't fully constrain a sphere, there should be only one reasonably sized sphere that results in the same XYZ transform from frame to sphere center for both frames.

I'm going to think about this but hopefully someone has a neat algorithm for solving this on hand.

• Quote

Yes, I know. The 4-point (aka Center-of-sphere method) is well known. However, some manufacturers have 3-point method...

btw. this is a KUKA forum. why are you asking for support here if another robot brand is used? and why not mention the exact brand(s) so we can try to relate?

the terminology may be different too; you did not explain what the "3-point method" is or how it is calculated.

when looking for an idea or principles to solve something, it makes little sense to just look at a dump of complex code that represents a working solution. analysis requires understanding of the principles that lead to that solution. this also means that definitions are very important:

if a "point" is considered to be just a set of translation coordinates in space (without orientation), then all we have are X, Y, Z values.

two such points can only define a line in space.

three such points can define a plane, etc.

so to define a TCP this way, one needs a minimum of 4 points, and this is what KUKA uses. in this case the points are located on a sphere.

if by "point" you mean a frame or larger structure that contains XYZ as well as orientation, then the number of such elements ("points") can be lower than 4...

for example, a frame represents translation and orientation (6DoF) and can be seen as an over-constrained plane (rotation about the normal vector is not important). so the relaxed interpretation (plane) requires less data than is contained in a frame, and that is ok since it means we do not need all DoF.

using that definition:

the intersection of two planes is a line (an infinite number of points).

the intersection of three planes (could be) a point.

another option is to consider the intersection of the three normals (find the minimum-distance point, since they will not necessarily intersect)

and that is how i would start considering this...
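That last idea, a least-squares "intersection" of the three tool-axis lines, can be sketched as follows. `closest_point_to_lines` is a hypothetical helper of my own, not KUKA's implementation; it minimizes the summed squared distance to the lines via the normal equations:

```python
import numpy as np

def closest_point_to_lines(origins, directions):
    """Least-squares point minimizing total squared distance to a set of lines.

    Each line is given by a point `o` on it and a direction vector `d`.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float) / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ np.asarray(o, dtype=float)
    return np.linalg.solve(A, b)
```

With exact data the result is the common intersection; with real taught poses it returns the point closest to all three (skewed) axes.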

• I am sorry if this is not a good fit for this forum, but I have solved the problem. It's a bit long-winded, but I've tested this, attached the code, and it works quite well. In fact, I think it's a bit more elegant than solving by sphere fitting.

A two-point TCP definitely isn't possible. As panic mode suggested, executing this approach with only two samples results in an infinite number of solutions, all falling on a line.

A TCP can be generated given only three samples. Each sample must contain the A6 flange position and rotation in order to derive the TCP. This is done by solving a system of linear equations, not by fitting points to a sphere, which would require four samples.

Here is the math:

Unknowns

P = teaching center point

T = XYZ transform

Givens (three of each, one per sample)

O = sample coordinate (A6 flange XYZ position at given sample)

R = rotation matrix of sample origin (A6 flange rotation matrix at given sample)

The 3x3 rotation matrix R is in this form:

[xx xy xz]
[yx yy yz]
[zx zy zz]

A point P is obtained by applying the offset T within a coordinate system that has XYZ origin O and rotation matrix R:

Rxx*Tx + Rxy*Ty + Rxz*Tz + Ox = Px

Ryx*Tx + Ryy*Ty + Ryz*Tz + Oy = Py

Rzx*Tx + Rzy*Ty + Rzz*Tz + Oz = Pz
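In matrix form those three scalar equations are just P = R·T + O. A quick NumPy check with made-up numbers (the pose values here are purely illustrative):

```python
import numpy as np

# Flange rotation R (90 deg about Z), flange position O, tool offset T.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
O = np.array([100.0, 50.0, 30.0])
T = np.array([10.0, 0.0, 5.0])

P = R @ T + O   # tool tip position in base coordinates -> [100., 60., 35.]
```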

In this case, the three samples O, although in different positions and orientations, should result in the same position P when transformed by the XYZ offset T:

R1xx*Tx + R1xy*Ty + R1xz*Tz + O1x = R2xx*Tx + R2xy*Ty + R2xz*Tz + O2x = R3xx*Tx + R3xy*Ty + R3xz*Tz + O3x

R1yx*Tx + R1yy*Ty + R1yz*Tz + O1y = R2yx*Tx + R2yy*Ty + R2yz*Tz + O2y = R3yx*Tx + R3yy*Ty + R3yz*Tz + O3y

R1zx*Tx + R1zy*Ty + R1zz*Tz + O1z = R2zx*Tx + R2zy*Ty + R2zz*Tz + O2z = R3zx*Tx + R3zy*Ty + R3zz*Tz + O3z

P is no longer a factor when equating the samples. Adding the first and third expressions (each equal to the second) reduces the chain to a single equation per axis; then put it into terms of O and separate the T components:

R1xx*Tx + R1xy*Ty + R1xz*Tz + O1x + R3xx*Tx + R3xy*Ty + R3xz*Tz + O3x = 2*R2xx*Tx + 2*R2xy*Ty + 2*R2xz*Tz + 2*O2x

R1yx*Tx + R1yy*Ty + R1yz*Tz + O1y + R3yx*Tx + R3yy*Ty + R3yz*Tz + O3y = 2*R2yx*Tx + 2*R2yy*Ty + 2*R2yz*Tz + 2*O2y

R1zx*Tx + R1zy*Ty + R1zz*Tz + O1z + R3zx*Tx + R3zy*Ty + R3zz*Tz + O3z = 2*R2zx*Tx + 2*R2zy*Ty + 2*R2zz*Tz + 2*O2z

R1xx*Tx + R1xy*Ty + R1xz*Tz + R3xx*Tx + R3xy*Ty + R3xz*Tz - 2*R2xx*Tx - 2*R2xy*Ty - 2*R2xz*Tz = 2*O2x - O3x - O1x

R1yx*Tx + R1yy*Ty + R1yz*Tz + R3yx*Tx + R3yy*Ty + R3yz*Tz - 2*R2yx*Tx - 2*R2yy*Ty - 2*R2yz*Tz = 2*O2y - O3y - O1y

R1zx*Tx + R1zy*Ty + R1zz*Tz + R3zx*Tx + R3zy*Ty + R3zz*Tz - 2*R2zx*Tx - 2*R2zy*Ty - 2*R2zz*Tz = 2*O2z - O3z - O1z

Tx(R1xx + R3xx - 2*R2xx) + Ty(R1xy + R3xy - 2*R2xy) + Tz(R1xz + R3xz - 2*R2xz) = 2*O2x - O3x - O1x

Tx(R1yx + R3yx - 2*R2yx) + Ty(R1yy + R3yy - 2*R2yy) + Tz(R1yz + R3yz - 2*R2yz) = 2*O2y - O3y - O1y

Tx(R1zx + R3zx - 2*R2zx) + Ty(R1zy + R3zy - 2*R2zy) + Tz(R1zz + R3zz - 2*R2zz) = 2*O2z - O3z - O1z

Now form a system of equations A * x = B.

[R1xx - 2*R2xx + R3xx, R1xy - 2*R2xy + R3xy, R1xz - 2*R2xz + R3xz]*[Tx] = [-O1x + 2*O2x - O3x]
[R1yx - 2*R2yx + R3yx, R1yy - 2*R2yy + R3yy, R1yz - 2*R2yz + R3yz]*[Ty] = [-O1y + 2*O2y - O3y]
[R1zx - 2*R2zx + R3zx, R1zy - 2*R2zy + R3zy, R1zz - 2*R2zz + R3zz]*[Tz] = [-O1z + 2*O2z - O3z]

If working from Python, you can use numpy.linalg.solve, or you can otherwise write a Gaussian-elimination function with partial pivoting. We are solving for x, which gives the X, Y, Z offset.
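A minimal version of that solve with NumPy (my own sketch, not the attached file; `samples` holds one (O, R) pair per taught pose):

```python
import numpy as np

def solve_tcp_3pt(samples):
    """samples: three (O, R) pairs -- flange XYZ position (length 3) and
    3x3 rotation matrix for each taught pose. Returns the tool offset T."""
    (O1, R1), (O2, R2), (O3, R3) = [(np.asarray(o, dtype=float),
                                     np.asarray(r, dtype=float))
                                    for o, r in samples]
    # A*T = B, exactly as derived above. A becomes singular when the three
    # orientations are too similar, so vary the wrist as much as possible.
    A = R1 - 2.0 * R2 + R3
    B = 2.0 * O2 - O1 - O3
    return np.linalg.solve(A, B)
```

With noisy real-world samples, collecting more than three poses and using `np.linalg.lstsq` on the stacked equations would be the more robust variant of the same idea.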

I have attached the solution in Python.

• I'm guessing this solution assumes the KUKA Euler sequence of Rz-Ry'-Rx", at least for the conversion between robot coordinates and matrix (at the matrix level, all the Eulers are the same).

How would you do this as a more general case?

• The only thing that depends on the angle values is the converter used to turn angles into a rotation matrix. In my Python code, the rot_to_mat function is specifically for KUKA's Tait-Bryan ZYX convention.

Everything else is universal. All of the math above uses only XYZ coordinates and rotation matrices.
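For reference, a ZYX (KUKA A-B-C) angles-to-matrix converter might look like this. This is my own sketch of such a function; the `rot_to_mat` in the attached code may differ in details:

```python
import numpy as np

def rot_to_mat(a, b, c):
    """KUKA A, B, C angles in degrees (intrinsic Rz * Ry * Rx) -> 3x3 matrix."""
    a, b, c = np.radians([a, b, c])
    ca, sa = np.cos(a), np.sin(a)
    cb, sb = np.cos(b), np.sin(b)
    cc, sc = np.cos(c), np.sin(c)
    Rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cc, -sc], [0.0, sc, cc]])
    return Rz @ Ry @ Rx
```

For a different robot brand, only this converter needs to be swapped for the brand's Euler convention; the solver itself stays unchanged.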