Author Topic: How can I calculate the angle between the camera coordinate system and the robot base?  (Read 4150 times)

Offline mckmk03

  • Newbie
  • *
  • Thank You
  • -Given: 0
  • -Receive: 0
  • Posts: 7
Hi. I attached a picture for my question. For a vision-guided robotic system, the camera is on the KUKA robot hand. I must calculate the rotation transformation from the camera coordinate system to the robot base. How can we calculate the angle between the camera coordinate system and the robot base coordinate system?
Thank you so much.

Offline rzapo

  • Sr. Member
  • ****
  • Thank You
  • -Given: 53
  • -Receive: 19
  • Posts: 400
  • Yaskawa Robotics Agent for Romania
    • SAM Robotics
From your drawing, you must use cos(alpha)...

Offline mckmk03

  • Newbie
  • *
  • Thank You
  • -Given: 0
  • -Receive: 0
  • Posts: 7
Yes, I must use cos(alpha), but how can I find the alpha angle?

Offline Robo Guru

  • Hero Member
  • *****
  • Thank You
  • -Given: 1
  • -Receive: 68
  • Posts: 664
Good question! I'm not certain, but I would like to know!

Offline Tilman

  • Full Member
  • ***
  • Thank You
  • -Given: 3
  • -Receive: 4
  • Posts: 103
    • Midi-Formatique
Hello mckmk03,


1) You know the stage coordinate system CStageW in world coordinates, i.e. X, Y, Z (the origin) and the rotation angles A, B, C (or similar), which give the orientation of the system. Right?

2) You know the camera coordinate system CCamW in world coordinates. You know it because the camera is fixed on the robot arm, and you know the position and the orientation of the arm. Right?

3) With a homogeneous coordinate transformation, you can translate a point from one coordinate system to another. It consists of applying a rotation and a translation. http://en.wikipedia.org/wiki/Rotation_matrix explains how the rotation matrix is calculated from the Euler angles. For KUKA robots, the geometric operator represents this transformation.

-> With CStageW you can calculate the matrix TStage that will transform the coordinates of a point given in the stage coordinate system into world coordinates.

-> With CCamW you can calculate the matrix TCam that will transform the coordinates of a point given in the camera coordinate system into world coordinates.

4) If you want to express a point Ps given in stage coordinates as a point Pc in camera coordinates, you have to calculate

Pc = Inv(TCam) TStage Ps.

Inv is here the inverse matrix (for the pure rotation part, the inverse is simply the transpose).

5) With the formula from 3), you can use the matrix M = Inv(TCam) TStage to get the rotation angles between the 2 coordinate systems.
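In plain Python (outside the robot), steps 4) and 5) can be sketched like this. The two example poses are invented values, purely to show the mechanics of the matrix chain:

```python
import math

def rot_z(a_deg):
    """3x3 rotation about Z by a_deg degrees."""
    a = math.radians(a_deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv_T(T):
    """Invert a rigid transform: inverse is [R^T | -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of rotation
    t = [-sum(R[i][k] * T[k][3] for k in range(3)) for i in range(3)]
    return make_T(R, t)

# Invented example poses (not from the thread):
T_stage = make_T(rot_z(0), [100.0, 0.0, 0.0])    # stage frame in world
T_cam   = make_T(rot_z(90), [0.0, 50.0, 200.0])  # camera frame in world

# Step 4: matrix that maps stage coordinates into camera coordinates
M = mat_mul(inv_T(T_cam), T_stage)
Ps = [[0.0], [0.0], [0.0], [1.0]]                # stage origin, as a column
Pc = mat_mul(M, Ps)

# Step 5: rotation angle about Z between the two frames, read off M
angle = math.degrees(math.atan2(M[1][0], M[0][0]))
```

With these made-up poses, Pc is simply the stage origin seen from the camera, and `angle` is the planar rotation between the two frames.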


Hope this helps,
Tilman/France

Offline Robo Guru

  • Hero Member
  • *****
  • Thank You
  • -Given: 1
  • -Receive: 68
  • Posts: 664
Tilman,

Anyway, could you explain this without applying it to a KUKA robot? I'm not familiar with what CStageW refers to.

Also, what are stage coordinates? Are they similar to robot base frame coordinates on a Motoman?

Thanks,
RoboGuru

Offline Tilman

  • Full Member
  • ***
  • Thank You
  • -Given: 3
  • -Receive: 4
  • Posts: 103
    • Midi-Formatique
Hello RoboGuru,

Well, the problem is perhaps that I did not understand the problem.

I saw "stage coordinate system" in the picture from mckmk03, and in the text I read "robot base coordinate". I thought these designated the same system. This is why I called it CStageW (thus a Cartesian coordinate system which is defined in the world coordinate system by giving the origin X, Y, Z and the orientation by the angles A = rotation around Z, B = rotation around Y and C = rotation around X).

Am I completely wrong?
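By the way, that A, B, C convention can be checked numerically. In plain Python, the rotation matrix is built as Rz(A)·Ry(B)·Rx(C), and the three angles can be recovered from it again (the angle values below are arbitrary examples):

```python
import math

def rot_x(c_deg):
    a = math.radians(c_deg); c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(b_deg):
    a = math.radians(b_deg); c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a_deg):
    a = math.radians(a_deg); c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Arbitrary example angles in degrees:
A, B, C = 30.0, 20.0, 10.0

# A = rotation around Z, B around Y, C around X, applied in that order:
R = mat_mul(rot_z(A), mat_mul(rot_y(B), rot_x(C)))

# Recover the angles from R (valid as long as cos(B) != 0):
B_rec = math.degrees(math.asin(-R[2][0]))
A_rec = math.degrees(math.atan2(R[1][0], R[0][0]))
C_rec = math.degrees(math.atan2(R[2][1], R[2][2]))
```

The recovery formulas are the standard ZYX Euler-angle extraction; near cos(B) = 0 (gimbal lock) they break down and A and C are no longer unique.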

Greetings,
Tilman/France


DHawk

  • Guest
What type of camera are you using? Most cameras can be calibrated automatically now by using a grid target.

Offline mckmk03

  • Newbie
  • *
  • Thank You
  • -Given: 0
  • -Receive: 0
  • Posts: 7
Thank you, my friends, for the answers.
Tilman;

Quote
Tilman said: 1) You know the stage coordinate system CStageW in World coordinates -> Thus X,Y,Z (the origin) and the rotation angles A,B,C (or similar, which give the orientation of the system). Right?

Quote
Tilman said: 2) You know the camera coordinate system CCamW in World coordinates. You know them, because the camera is fixed on the robot arm, and you know the position and the orientation of the arm. Right?

1) Yes, I know the stage coordinate system. I measured the stage, and I know the base coordinate system (X, Y, Z and rotation A, B, C) for the stage. That's right.

2) Yes, the camera (the camera is an Allied camera and the software is LabVIEW 2010) is fixed on the robot arm, but I am not sure about the camera coordinate system. In the application, do I need to define a tool for the camera? How can I know the camera coordinate system? I think I must define a tool for the camera in the stage base system.

If I know the camera coordinate system, I know the angle between CStageW and CCamW. Then I can calculate the matrix TCam that will transform the coordinates of a point given in the camera coordinate system into world coordinates.

Or we can use a calibration plate. Which one is more reliable: defining a tool for the camera or using a calibration plate?

Thank you.
Best regards.
« Last Edit: June 17, 2011, 12:50:06 PM by mckmk03 »

Offline Tilman

  • Full Member
  • ***
  • Thank You
  • -Given: 3
  • -Receive: 4
  • Posts: 103
    • Midi-Formatique
Hello mckmk03

Quote
Or we can use a calibration plate. Which one is more reliable: defining a tool for the camera or using a calibration plate?

A calibration of a camera consists of 2 things: the intrinsic parameters (focal length, pixel density, width and height) and the extrinsic parameters, which "denote the coordinate system transformations from 3D world coordinates to 3D camera coordinates" [http://en.wikipedia.org/wiki/Camera_resectioning].

If I understood you correctly, the camera is fixed on the arm, so it permanently changes its position. This means that the extrinsic parameters also change all the time. Thus: if you use a calibration plate, the calibration has to be redone every time your robot is in a certain position.

My idea was to calculate the extrinsic parameters yourself. You know the position where the camera is fixed on the arm, and the joint angles and the dimensions of the robot. You can thus calculate the position and the orientation of the camera coordinate system. Perhaps there are internal variables of KUKA that can help (like $TFLWP). Otherwise, it would be a matrix calculation [Denavit-Hartenberg].
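As a sketch of that chain in plain Python: the camera pose in world coordinates is the flange pose composed with the fixed mounting offset of the camera on the flange. All numbers below are invented placeholders, not real calibration data:

```python
import math

def rot_z(a_deg):
    a = math.radians(a_deg); c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def make_T(R, t):
    """4x4 homogeneous transform from a 3x3 rotation and a translation."""
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Invented example: flange pose in world coordinates, and the fixed
# offset of the camera on the flange (what $TFLWP-style data would give):
T_flange_world = make_T(rot_z(45.0), [500.0, 300.0, 400.0])
T_cam_flange   = make_T(rot_z(0.0),  [0.0, 0.0, 80.0])  # camera 80 mm along flange Z

# Chaining the two transforms gives the camera pose in world coordinates:
T_cam_world = mat_mul(T_flange_world, T_cam_flange)

camera_origin_world = [T_cam_world[i][3] for i in range(3)]
```

This is exactly what defining a "tool" for the camera does on the controller: the fixed offset is measured once, and the controller does the chaining for every robot position.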

Afterwards, if you know the position and the orientation of the camera coordinate system, you can use the geometric operator and the INV_POS function from KUKA to transform coordinates from the stage coordinate system to the camera system and vice versa.

It remains to calculate the projection onto the CCD chip and possibly the lens distortion.

Greetings,
Tilman/France

Offline mckmk03

  • Newbie
  • *
  • Thank You
  • -Given: 0
  • -Receive: 0
  • Posts: 7
hi Tilman

I will try your suggestions. I will try to calculate the camera coordinate system using $TFLWP. I researched $TFLWP: it gives the offset between the flange point and another point. So I think I can find the camera X, Y, Z, A, B, C precisely.

thank you so much.
Best Regards.

Offline mckmk03

  • Newbie
  • *
  • Thank You
  • -Given: 0
  • -Receive: 0
  • Posts: 7
Re: How can I calculate the angle between the camera coordinate system and the robot base?
« Reply #11 on: December 06, 2011, 03:02:03 PM »
Hi. I will try to explain my problem with the picture.

There is a zero position for the object in the image. I move the object, and the new position of the object is x = 10 mm, y = 10 mm, and the angle is 45 deg.

I am trying to convert the offset to the KUKA robot base.
base_data[2] is the robot base system
base_data[12] is a temporary base system


DECL FRAME target1_coord = {X 0, Y 0, Z 0, A 0, B 0, C 0}
DECL FRAME target2_coord = {X 0, Y 0, Z 0, A 0, B 0, C 0}

target1_coord.X = x_ofset
target1_coord.Y = y_ofset
target1_coord.A = a_ofset  ; the measured angle offset (45 deg)

target2_coord = INV_POS(target1_coord)
BASE_DATA[12] = BASE_DATA[2] : target2_coord

LIN p2  ; with BASE_DATA[12] selected as the active base


I convert the camera offset to the base coordinate system, but the robot target is not correct. How can I calculate the correct offset for the robot base system?
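For what it's worth, the frame math for a purely planar offset can be checked outside the robot. In this plain-Python sketch, the camera-to-base rotation angle is an assumed placeholder; it has to come from your actual camera/tool calibration:

```python
import math

# Offset measured in the image/camera frame (values from the post):
x_cam, y_cam, a_cam = 10.0, 10.0, 45.0

# Assumed planar rotation of the camera frame relative to the robot
# base; this value is a placeholder and must come from calibration:
cam_to_base_deg = 90.0
r = math.radians(cam_to_base_deg)
c, s = math.cos(r), math.sin(r)

# Rotate the translational part of the offset into base coordinates:
x_base = c * x_cam - s * y_cam
y_base = s * x_cam + c * y_cam

# For rotations about the shared Z axis, the relative angle of the
# object with respect to its zero position is the same in both frames:
a_base = a_cam
```

If the robot then moves to a wrong target, the first things to check are the sign convention of this rotation and whether INV_POS should be applied at all, i.e. whether the measured offset describes the object in the camera frame or the camera in the object frame.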