KRC4 8.6.6

At present I don't have the real robot - I'm only using WorkVisual 6.0 on a PC.

I'm trying to figure out the best way to automate the manual KUKA 3-point base calibration procedure using a simple binary sensor connected to a fast measuring input. I want to be able to capture my points whilst a nominal base and measuring tool are active, and to produce a correction frame which I can use to shift the nominal program to fit the actual location of my workpiece:

New_Base = Nominal_Base : Correction_frame
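In KRL terms I'd expect that to be the geometric operator, something like the sketch below (untested, written offline in WorkVisual - the base numbers and correction values are placeholders of my own):

```
; Untested sketch - base numbers and values are placeholders.
DECL FRAME Correction_Frame

; correction as determined from the sensor measurements
Correction_Frame = {X 1.5, Y -0.8, Z 0.2, A 0.3, B 0.0, C 0.0}

; Shift the nominal base (here BASE_DATA[2]) by the correction.
; Right-multiplication with ":" means the correction is expressed
; in the nominal base's own coordinates.
BASE_DATA[3] = BASE_DATA[2] : Correction_Frame
```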

I'm sure I've seen this discussed before, but I can't find the threads - they may be obsolete, as the links no longer work wherever they were mentioned. These are from 2017:

The math was discussed before:

calculate TCP using four points (calculate center of sphere from 4 points)

https://www.robot-forum.com/robotforum/kuk…98699/#msg98699

calculate base using 3points

https://www.robot-forum.com/robotforum/kuk…94103/#msg94103

I've considered using TouchSense 3.2, as has been suggested elsewhere, but I'm not convinced it will produce the correction frame that I need in the way the manual 3-point method does. The example program shown in section 9.1 of the documentation is confusing, as it doesn't show how the 3D correction is applied on lines 30-35, given that the points were taught with BASE[0], not with a pre-defined workpiece base:

```
1  DEF Example( )
2  INI
3
4  PTP HOME Vel= 100 % DEFAULT
5
6  PTP P1 CONT Vel=100 % PDAT1 Tool[1]:Torch1 Base[0]
7  PTP P2 CONT Vel=100 % PDAT2 Tool[1]:Torch1 Base[0]
8  TouchSense SEARCH LIN P3 Vel=2 m/s CPDAT1 VIA P4 CD1 SP1 TOOL[1]:Torch1 BASE[1]:Table
9  PTP P5 Vel=100 % PDAT3 Tool[1]:Torch1 Base[0]
10
11 TouchSense Corr 1D CD1
12 TouchSense Check Point P8 Radius=3 mm
13
14 PTP P6 CONT Vel=100 % PDAT4 Tool[1]:Torch1 Base[0]
15 PTP P7 CONT Vel=100 % PDAT5 Tool[1]:Torch1 Base[0]
16 TouchSense SEARCH LIN P8 Vel=2 m/s CPDAT2 VIA P9 CD2 SP2 Tool[1]:Torch1 BASE[1]:Table
17
18 TouchSense Corr 2D CD1 CD2
19 TouchSense Check Point P13 Radius=3 mm
20
21 PTP P10 Vel=100 % PDAT6 Tool[1]:Torch1 Base[0]
22 TouchSense SEARCH LIN P11 Vel=2 m/s CPDAT3 VIA P12 CD3 SP3 Tool[1]:Torch1 BASE[1]:Table
23 PTP P13 Vel=100 % PDAT7 Tool[1]:Torch1 Base[0]
24
25 TouchSense Corr 3D CD1 CD2 CD3
26
27 TouchSense Check Point P17 Set=CP1
28 TouchSense Check Point P18 Radius=5 mm
29
30 PTP P14 CONT Vel=100 % PDAT8 Tool[1]:Torch1 Base[0]
31 LIN P15 CONT Vel=2 m/s CPDAT4 Tool[1]:Torch1 Base[0]
32 PTP P16 Vel=100 % PDAT9 Tool[1]:Torch1 Base[0]
33 LIN P17 Vel=0.2 m/s CPDAT5 Tool[1]:Torch1 Base[0]
34 LIN P18 Vel=0.2 m/s CPDAT6 Tool[1]:Torch1 Base[0]
35 PTP P19 CONT Vel= 100 % PDAT10 Tool[1]:Torch1 Base[0]
36
37 PTP HOME Vel= 100 % DEFAULT
38
39 END
```

How can this work? I can see that the three correction vectors CD1, CD2 and CD3 were determined using BASE[1]:Table, but the points to be corrected are executed using Base[0], not BASE[1]:Table as one would expect. Is the shift applied by a Technology Function, as with a TTS weave? I find the documentation confusing because it doesn't actually explain how the shift is done.

I understand the KUKA 3-point base calibration procedure uses some embedded DLL code, so there is no KRL to examine as there is with some other included functions. I'm sure that if I had a real robot to play with, I could come up with a reasonable solution using instructions such as the geometric operator and INV_POS(), for example, but I'm working from home at present, so proper development and tryout is difficult.
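To illustrate what I mean: if I could re-measure the actual workpiece base from my three touched points, I'd expect the correction frame to fall out of INV_POS() and the geometric operator roughly like this (untested sketch - the base numbers and the measured values are placeholders, and building Measured_Base from the three points is the part I haven't solved):

```
; Untested sketch - base numbers and values are placeholders.
DECL FRAME Nominal_Base, Measured_Base, Correction_Frame

Nominal_Base = BASE_DATA[2]  ; base used by the nominal program

; Placeholder - in reality this would be constructed from the
; three points captured via the fast measuring input.
Measured_Base = {X 501.2, Y 199.4, Z 10.1, A 0.4, B 0.0, C 0.0}

; Since Nominal_Base : Correction_Frame = Measured_Base,
; the correction is:
Correction_Frame = INV_POS(Nominal_Base) : Measured_Base

; and the shifted base to run the nominal program in:
BASE_DATA[3] = Nominal_Base : Correction_Frame
```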

Can anyone point me to some KRL code as an example, so that I can understand how the task might be accomplished?

Thanks in advance!