
How can I get the TCP position (Cartesian) in a coordinate system?

  • Jonson
  • June 5, 2018 at 3:18 AM
  • Thread is Resolved
  • Jonson
    • June 5, 2018 at 3:18 AM
    • #1

    Hello everyone,

    My robot system is as shown below:
    -> DX200
    -> R1 + S1 (2 axes)
    -> SETUP - GRP COMBINATION - R1+S1:S1 finished
    -> ROBOT - ROBOT CALIB finished: the TCP can track the axes in SYNCRO mode

    My question is: how can I get the Cartesian coordinates in a combined coordinate system?
    Right now I can read the current position under ROBOT - CURRENT POSITION, but R1 and S1 are not combined into one coordinate system.
    If I move the robot, (X, Y, Z, Rx, Ry, Rz) changes; if I move the external rotation axes, (S1:1, S1:2) changes.
    However, I need a position (X, Y, Z, Rx, Ry, Rz) that includes both R1 and S1.
    With a KUKA robot, I can get this position in the WORLD coordinate system: if the robot is shifted, XYZ changes, and if the external axes are rotated, XYZABC changes as well.

    I don't know how to get what I want. Is there an INFORM instruction or a MotoPlus API that can supply this, or do I need another option?

    Thank you.
    Kind regards,
    Jonson

  • Motouser
    • June 5, 2018 at 8:37 AM
    • #2

    I think you need a command in INFORM:

    GETS PX001 $PX000   'gets the current position in pulse coordinates and stores it in P001 (for example)
    GETS PX001 $PX001   'gets the current position in Cartesian coordinates and stores it in P001 (for example)

    The difference (as you can see above) is the second argument: $PX000 for pulse mode, $PX001 for Cartesian.

    When you have an external axis, the GETS instruction works the same but also writes the EX00x variable.

    Ex.

    GETS PX010 $PX001

    The variables P010 and EX010 will hold your Cartesian position for R1 (P010) and S1 (EX010).


  • Jonson
    • June 6, 2018 at 3:01 AM
    • #3
    Quote from Motouser


    But I just want to get one Cartesian coordinate: with an L-type positioner, I need (X, Y, Z, Rx, Ry, Rz) without separate S1:1 and S1:2 values, but with S1:1 reflected in Rz and S1:2 in Ry.

    Ex.

    With a KUKA robot and an L-type positioner, I calibrate the external axes using Base[32]; then, in the Base[32] coordinate system, if I move the 8th axis while keeping the robot still, the robot position in Base[32] changes.

    Thank you for your reply.


    Jonson

  • Motouser
    • June 6, 2018 at 8:31 AM
    • #4

    You could create this routine:

    GETS PX090 $PX001
    'now you have PX090 (robot Cartesian position) and EX090 (S1 position)

    GETE D000 EX090(1)
    SETE P090(4) D000
    'this puts the value of S1(1) into Rx
    'I don't remember if you can do this directly: SETE P090(4) EX090(1)

    GETE D001 EX090(2)
    SETE P090(5) D001
    'this puts the value of S1(2) into Ry

    SETE P090(6) 0
    'clear the Rz value


  • Jonson
    • June 6, 2018 at 9:04 AM
    • #5
    Quote from Motouser

    Firstly, thank you for your help.
    Your code puts S1:1 and S1:2 into Rx and Ry, but I need a combination of the two coordinate systems.
    You can see in the attached picture: the robot stays still and the 8th axis rotates about 45°, and the TCP in the sync coordinate system changes as shown by the green line. You can take the green line as the vector from the origin of this coordinate system to the TCP.
    Assume that before moving the 8th axis the position is X=100, Y=0, Z=0, Rx=0, Ry=0, Rz=0. Moving the 8th axis about 45° can also be treated as rotating it about the Z axis, so the position after the move is X=-100/sqrt(2), Y=100/sqrt(2), Z=0, Rx=0, Ry=0, Rz=45.
    So you can see that the Cartesian position in the sync coordinate system is not the result of simply adding the values together.
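    As a minimal sketch of this geometry in plain C (not Motoman code; the starting point, the 45° angle and the sign convention are only assumptions for illustration), the new X and Y come from applying a rotation about Z to the old vector, not from adding anything to it:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double PI = 3.14159265358979323846;
        double x = 100.0, y = 0.0;           /* TCP in the sync frame before the move */
        double theta = 45.0 * PI / 180.0;    /* assumed 8th-axis rotation about Z; use -theta
                                                if the frame rotates while the TCP stays still */

        /* rotate the vector (x, y) by theta about Z */
        double xr = cos(theta) * x - sin(theta) * y;
        double yr = sin(theta) * x + cos(theta) * y;

        printf("X = %.3f  Y = %.3f  Rz = %.1f deg\n", xr, yr, 45.0);
        return 0;
    }

    Whichever sign convention matches the picture, the combined position has to come out of a rotation matrix, which is why it cannot be built by copying the axis angles into Rx/Ry/Rz.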

    Thank you.

    Jonson

    Images

    • AttachmentPicture.png
      • 10 kB
      • 1,079 × 293
      • 24

    Files

    AttachmentPicture.png_thumb 5.09 kB – 329 Downloads
  • Motouser
    • June 6, 2018 at 12:11 PM
    • #6

    Just a question: do you have a master tool frame (MTF) for the master robot?


  • Jonson
    • June 7, 2018 at 5:06 AM
    • #7
    Quote from Motouser


    Just a question: do you have a master tool frame (MTU) for the master robot?

    Maybe I can see the master tool, but I have no idea how to set it up or where to find it. Is there any relationship to this?

  • Motouser
    • June 7, 2018 at 8:59 AM
    • #8

    OK. Let me explain.

    I have an R1+S1 robot (so I have only one rotation axis, not two like you, but I think it will work the same).

    I also have an MTF (unfortunately it's a paid option; I set it up in place of User Frame 1, so when I rotate the external axis I can see the TCP change there). This frame gives me the position of the robot with respect to the rotation of the axis, so if I keep the robot at a point (X, Y, Z, Rx, Ry, Rz) and rotate only the axis, the point's coordinates change (like in your attached picture).

    So, geometrically, you have a vector from the MTF to your TCP.

    I don't know if there is a $PX00x usable for the MTF, but you can get the position in Cartesian and then convert it to the MTF (it's like applying a rotation matrix).

    Example:
    GETS PX090 $PX001
    'now you have PX090 (robot Cartesian position) and EX090 (S1 position)

    CNVRT PX091 PX090 MTF
    'now you have your point referenced to the MTF

    For the position of the external axis you can do as I said before.

    I hope this is what you need.
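
    As a rough illustration of what that conversion amounts to geometrically (plain C, not INFORM; the MTF pose used here is an assumed example, in reality it comes from the frame you calibrated): a point measured in the base frame is expressed in the MTF by applying the inverse of the MTF pose.

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double PI = 3.14159265358979323846;
        double rz = 45.0 * PI / 180.0;   /* assumed MTF rotation about Z, seen from the BF */
        double ox = 500.0, oy = 0.0;     /* assumed MTF origin in the BF                   */
        double px = 600.0, py = 100.0;   /* TCP position in the BF                         */

        /* p_mtf = Rz(rz)^T * (p_bf - origin), written out here for X and Y only */
        double dx = px - ox, dy = py - oy;
        double mx =  cos(rz) * dx + sin(rz) * dy;
        double my = -sin(rz) * dx + cos(rz) * dy;

        printf("point in MTF: X = %.3f  Y = %.3f\n", mx, my);
        return 0;
    }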


  • Jonson
    • June 7, 2018 at 11:35 AM
    • #9
    Quote from Motouser


    Yes, that is exactly what I want. But where can I find the MTF, and how can I calibrate this frame like a user frame?
    With a KUKA robot, when I calibrate the SYNC coordinate system it lets me specify a BASE[X] in which to save the frame. With Motoman, will the MTF be created and saved automatically?

    Thank you Motouser, you have helped me a lot.

    BR,
    Jonson

  • Motouser
    • June 7, 2018 at 12:14 PM
    • #10

    Hi,
    as I said it's a paid option, and the fact that you have a DX200 doesn't help.

    If you go to https://www.motoman.com/manuals/ you can download (maybe you already have it) the INDEPENDENT/COORDINATED CONTROL FUNCTION manual; you'll find a chapter called "Calibration between Manipulator and Station".

    Those instructions are the first step to create the MTF, and then there is a routine to set it properly (I mean into a user frame).

    Two problems:

    1. It's a paid option (FD parameters are involved)
    2. I've done this procedure on a DX100 (I think there is no difference, but I'm not sure)

    I suggest you contact Yaskawa; if you have already created a master tool and the FD parameters are set, you may only need a simple program to put the MTF into a user frame.

    Sorry


  • Jonson
    • June 8, 2018 at 2:17 AM
    • #11
    Quote from Motouser

    Hey, thank you. I may have already done everything you said; I can find a master tool in the P variables. But I need to test whether the coordinates change the way I want.
    So, thank you very much.

    BR,
    Jonson

  • Jonson
    • June 14, 2018 at 2:59 AM
    • #12

    Hey Motouser,

    I have found another problem: I can get the position in the MTF in MotoPlus, but I cannot transform a position (X, Y, Z, Rx, Ry, Rz) from the MTF to the BF or RF.
    For example, given the vector (X=5, Y=5, Z=5, Rx=0, Ry=0, Rz=0) in the MTF, how can I transform it to the BF or RF?
    If this can be done in MotoPlus, that would be very nice.

    Thank you,

    BR,
    Jonson

  • Motouser
    • June 14, 2018 at 8:40 AM
    • #13

    I don't have MotoPlus, but I imagine it works the same as on a real robot... so when you say:

    Quote from Jonson


    but I cannot transform a position (X, Y, Z, Rx, Ry, Rz) from the MTF to the BF or RF.


    you mean that CNVRT doesn't work?
    But you already have the position in BF/RF just with GETS PX0xx $PX001; this always returns the position referenced to the base frame (on a real robot), expressed in Cartesian coordinates.

    Geometrically, the point expressed in the BF is independent of the rotation of the external axis. I don't understand why you have to convert it back from the MTF; maybe I'm missing something.

  • Jonson
    • June 14, 2018 at 9:48 AM
    • #14

    I want to correct the robot path in real time, but I can only calculate the correction (bias) in the MTF. In MotoPlus there is no API to correct the robot path in the MTF, only in the BF or RF, so I need to transform the correction from the MTF back to the BF or RF.
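
    As a minimal sketch of that transformation in plain C (no MotoPlus calls; the function name and the example MTF orientation are hypothetical): because the correction is a small delta rather than an absolute position, only the rotation part of the MTF pose is needed to express it in the BF.

    #include <math.h>
    #include <stdio.h>

    /* Rotate a correction vector given in the MTF into the base frame.
       rpy_deg is the assumed MTF orientation in the BF as Rx, Ry, Rz angles
       (fixed-axis X-Y-Z convention, i.e. R = Rz * Ry * Rx). */
    static void mtf_delta_to_bf(const double rpy_deg[3], const double d_mtf[3], double d_bf[3])
    {
        const double PI = 3.14159265358979323846;
        double rx = rpy_deg[0] * PI / 180.0;
        double ry = rpy_deg[1] * PI / 180.0;
        double rz = rpy_deg[2] * PI / 180.0;

        double R[3][3] = {
            { cos(rz)*cos(ry), cos(rz)*sin(ry)*sin(rx) - sin(rz)*cos(rx), cos(rz)*sin(ry)*cos(rx) + sin(rz)*sin(rx) },
            { sin(rz)*cos(ry), sin(rz)*sin(ry)*sin(rx) + cos(rz)*cos(rx), sin(rz)*sin(ry)*cos(rx) - cos(rz)*sin(rx) },
            { -sin(ry),        cos(ry)*sin(rx),                           cos(ry)*cos(rx)                           }
        };

        for (int i = 0; i < 3; i++)
            d_bf[i] = R[i][0]*d_mtf[0] + R[i][1]*d_mtf[1] + R[i][2]*d_mtf[2];
    }

    int main(void)
    {
        double rpy[3]  = { 0.0, 0.0, 45.0 };  /* assumed MTF orientation in the BF */
        double bias[3] = { 5.0, 5.0, 5.0 };   /* correction calculated in the MTF  */
        double out[3];

        mtf_delta_to_bf(rpy, bias, out);
        printf("correction in BF: X = %.3f  Y = %.3f  Z = %.3f\n", out[0], out[1], out[2]);
        return 0;
    }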

  • Motouser
    • June 14, 2018 at 11:31 AM
    • #15

    My knowledge of MotoPlus is very limited (I just have the manuals and studied them in the past, but I have never programmed in this environment). If you don't have an API to correct the path or to convert a point into another reference frame, I think you have to write that code yourself, but I don't know how.

    P.S.

    Quote from Jonson


    I want to correct the robot path in real time

    Really???
    Just curious.

  • Kswitz
    • June 14, 2018 at 2:04 PM
    • #16

    What would be the input that causes the path to correct in real time? Through arc sensing, a laser?

  • Jonson
    • June 15, 2018 at 4:20 AM
    • #17
    Quote from Motouser


    I also have no idea how to do it.

    In MotoPlus you can correct the robot path every interpolation cycle, which is about 4 ms. I think you can find it in the MotoPlus reference manual.
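
    As a rough sketch of how such a per-cycle correction task could be structured (plain C; read_sensor_bias_mtf, mtf_bias_to_bf and apply_bf_increment are hypothetical placeholders, since the real sensor I/O, the frame conversion and the path-correction call depend on the application and on the MotoPlus reference manual):

    #include <stdio.h>

    /* Hypothetical placeholders: replace with real sensor reading, the MTF-to-BF
       conversion (see the sketch earlier in this thread) and the controller call. */
    static void read_sensor_bias_mtf(double bias[3]) { bias[0] = bias[1] = bias[2] = 0.0; }
    static void mtf_bias_to_bf(const double in[3], double out[3]) { for (int i = 0; i < 3; i++) out[i] = in[i]; }
    static void apply_bf_increment(const double inc[3]) { (void)inc; }

    int main(void)
    {
        const double MAX_STEP = 0.05;              /* assumed safety clamp per ~4 ms cycle */

        for (int cycle = 0; cycle < 10; cycle++)   /* in a real task this runs every interpolation cycle */
        {
            double bias_mtf[3], bias_bf[3];

            read_sensor_bias_mtf(bias_mtf);        /* new sensor data for this cycle    */
            mtf_bias_to_bf(bias_mtf, bias_bf);     /* express the correction in the BF  */

            for (int i = 0; i < 3; i++)            /* limit the per-cycle increment     */
            {
                if (bias_bf[i] >  MAX_STEP) bias_bf[i] =  MAX_STEP;
                if (bias_bf[i] < -MAX_STEP) bias_bf[i] = -MAX_STEP;
            }
            apply_bf_increment(bias_bf);           /* hand the increment to the controller */
        }
        return 0;
    }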

    Thank you a lot.

    BR,
    Jonson

  • Jonson
    • June 15, 2018 at 4:24 AM
    • #18
    Quote from Kswitz


    What would be the input that causes the path to correct in real time? Through arc sensing, a laser?

    Both are OK, as long as you can supply the sensor data every interpolation cycle.

  • JuGoLaTe
    • January 30, 2021 at 9:44 AM
    • #19

    Hello Jonson!

    I am looking for this function. Can you maybe share the source code with me? It would help me a lot.

    Best regards.

    Hans
