Posts by Abel Wong

    My question is about the data that needs to be sent to the robot. For example:


    The current Z position of the robot is 1000 mm. The target position is to be increased by 5 mm every 12 ms, i.e. Z_target(12ms) = 1005

    Z_target(24ms) = 1010

    Z_target(36ms) = 1015

    Z_target(48ms) = 1020

    ...


    I have tested and verified #RELATIVE mode, where the data sent is

    Z_correction(12ms) = 5

    Z_correction(24ms) = 5

    Z_correction(36ms) = 5

    Z_correction(48ms) = 5


    Now, if sensor-guided motion based on absolute values (#ABSOLUTE) is adopted for the correction, should the correction data be sent as

    Z_correction(12ms) = 5

    Z_correction(24ms) = 10

    Z_correction(36ms) = 15

    Z_correction(48ms) = 20

    ...


    Or should the data be sent as

    Z_correction(12ms) = 1005

    Z_correction(24ms) = 1010

    Z_correction(36ms) = 1015

    Z_correction(48ms) = 1020


    According to the figures in the software documentation, my understanding is that the difference between #ABSOLUTE and #RELATIVE is the reference point from which the sent data are calculated.
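
    A small sketch of the three candidate data streams compared above (my own illustration, not KUKA code), for a target that starts at Z = 1000 mm and should advance by 5 mm every 12 ms cycle:

```python
# Illustration of the three candidate data streams for one 5 mm step per
# 12 ms cycle, starting from Z = 1000 mm. Function names are my own.

START_Z = 1000.0   # mm, robot Z position when sensor-guided motion starts
STEP = 5.0         # mm of desired motion per 12 ms interpolation cycle

def relative_correction(n: int) -> float:
    """#RELATIVE: increment with respect to the previous cycle."""
    return STEP                          # 5, 5, 5, 5, ...

def cumulative_offset(n: int) -> float:
    """#ABSOLUTE, read as the total offset from the start position."""
    return n * STEP                      # 5, 10, 15, 20, ...

def absolute_position(n: int) -> float:
    """#ABSOLUTE, read as an absolute target position."""
    return START_Z + n * STEP            # 1005, 1010, 1015, 1020, ...

for n in range(1, 5):
    print(f"{n * 12} ms: relative={relative_correction(n)}, "
          f"offset={cumulative_offset(n)}, absolute={absolute_position(n)}")
```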

    hmmm... it looks like your requirements are changing. i am not sure what you are really after.

    using RSI you can send corrections to the robot as well as read the robot position. btw you can use the fast option where RSI operates on a 4 ms interval rather than the default 12 ms. but ...

    so what exactly seems to be the problem? what is stopping you from doing what you want?

    I chose IPO instead of IPO_FAST, because in IPO mode there is already a median filter that ensures smooth robot motion.

    Requirements:

    Data acquisition: an infrared camera captures the moving target

    Motion compensation: control the robot motion through MoveCorr() so that the robot stays stationary relative to the moving target


    Therefore, we need to predict the robot position over the total delay time (acquisition delay, data-processing delay, and robot motion delay).
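
    As a minimal sketch of that prediction step (assuming a roughly constant target velocity; the delay value and function name below are illustrative, not part of the RSI package):

```python
def predict_position(last_pos_mm: float,
                     velocity_mm_per_ms: float,
                     total_delay_ms: float) -> float:
    """Linearly extrapolate the target position over the total delay
    (camera acquisition + data processing + robot motion delay)."""
    return last_pos_mm + velocity_mm_per_ms * total_delay_ms

# Example: target last measured at Z = 1000 mm, moving 5 mm per 12 ms,
# with an assumed total delay of 36 ms (hypothetical number).
velocity = 5.0 / 12.0                             # mm per ms
print(predict_position(1000.0, velocity, 36.0))   # -> 1015.0 mm
```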

    if you are interested in verifying timing, simply use Wireshark. it timestamps every message with high accuracy. in my experience the robot will send messages at the defined interval (12ms or 4ms). the response from the external system to the KRC may vary, and this is not because of the channel but rather computation or processing on that remote end. the delay here is what i would focus on, as this can get RSI tripped...

    Rather than the communication timing, I want to know the time from when the host sends the offset position to when the robot completes the offset motion. The robot motion is controlled with the MoveCorr() command.

    I assume you're using RSI in XML-over-TCP/IP mode?


    There really shouldn't be any delay. RSI will query the server every 12ms, without fail. Any packet delays or losses will cause RSI to throw a fault.
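
    As a rough illustration of that constraint, a minimal external-side loop might look like the sketch below (my own illustration; the port, the Sen/RKorr tag names and the Type string depend on the RSI Ethernet configuration XML, so treat them as placeholders):

```python
import socket
import xml.etree.ElementTree as ET

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 49152))          # port taken from the RSI config (assumed)

while True:
    data, addr = sock.recvfrom(2048)   # telegram from the KRC, every 12 / 4 ms
    ipoc = ET.fromstring(data).findtext("IPOC")
    # Answer immediately: a late or missing reply is what makes RSI fault.
    reply = ('<Sen Type="ImFree">'
             '<RKorr X="0" Y="0" Z="5" A="0" B="0" C="0"/>'
             f'<IPOC>{ipoc}</IPOC></Sen>')
    sock.sendto(reply.encode(), addr)
```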

    MoveCorr() is used to control the robot motion. The robot sends its position and time stamp to the host computer, and the relative offset sent back by the host computer is received within 12 ms. Will this offset be executed within the current 12 ms cycle or within the next 12 ms cycle?

    Hello, guys


    We want to use the host computer's clock and the time stamp in the RSI packet to calculate the communication delay, and then use a prediction algorithm to compensate the delayed robot position and improve the accuracy of our application. How can we synchronize the clocks, interpret the time stamp, or even measure the delay?


    The KSS version is 8.6.6, while the RSI version is 4.1.
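
    One possible way to look at that timing (my own sketch, not an official KUKA method) is to log the host receive time of every telegram, check how close consecutive telegrams are to the nominal 12 ms cycle, and note when a previously sent Z correction first becomes visible in the reported actual position; the variable names and the 0.1 mm tolerance below are assumptions:

```python
import time

_last = {"t": None}    # host receive time of the previous telegram

def on_telegram(ipoc: int, z_actual: float, pending: dict) -> None:
    """Call for every parsed robot telegram.

    pending maps IPOC -> (send_time, expected_z) for corrections already sent.
    """
    now = time.monotonic()

    # Cycle jitter: spacing between consecutive telegrams vs. the nominal 12 ms.
    if _last["t"] is not None:
        print(f"IPOC {ipoc}: {1000.0 * (now - _last['t']):.2f} ms since last telegram")
    _last["t"] = now

    # Command-to-motion delay: first telegram whose reported Z matches an
    # earlier commanded target within a small tolerance.
    for sent_ipoc, (t_sent, expected_z) in list(pending.items()):
        if abs(z_actual - expected_z) < 0.1:
            print(f"correction sent at IPOC {sent_ipoc} visible after "
                  f"{1000.0 * (now - t_sent):.2f} ms")
            del pending[sent_ipoc]
```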

    Hey, guys


    I have a few questions about the KUKA robot. I would like to know where exactly the monitoring-space data configured by the SafeOperation software is stored?


    The controller system is KSS 8.6 and the SafeOperation version is 3.5!
