Robotforum | Industrial Robots Community

 visual servoing with kuka iiwa


January 07, 2019, 08:43:24 AM

hsadeghian


Dear All,
I am going to use a KUKA iiwa in a visual servoing scenario.
Here is my plan:
I have connected a camera, mounted on the flange, to a remote computer that does all the image processing. On the robot side, I am going to use SmartServo, based on the provided sample program "SmartServoSampleSimpleCartesian".
The remote computer will receive information from the robot (for instance its pose) and send appropriate commands back to the Java program over a UDP connection.

Essentially, is that a proper plan?
How can I establish this connection?
Do I have to use the KLI for client-server programming?

Thanks in advance.


January 14, 2019, 09:56:02 AM
Reply #1

g.dechambrier



January 15, 2019, 02:28:24 PM
Reply #2

DrG


Hello hsadeghian,

Quote
Here is my plan:
I have connected a camera, mounted on the flange, to a remote computer that does all the image processing. On the robot side, I am going to use SmartServo, based on the provided sample program "SmartServoSampleSimpleCartesian".
The remote computer will receive information from the robot (for instance its pose) and send appropriate commands back to the Java program over a UDP connection.

Essentially, is that a proper plan?

That sounds like a proper plan.
Quote
How can I establish this connection?

On the KUKA Sunrise controller, the TCP/IP and UDP ports 30000-30010 are reserved for jobs like this. That means: if you open a socket on one of those ports, you can run the communication to the external computer directly via the KLI.
You will find many tutorials on the Internet about how to use UDP/TCP sockets in Java.
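To illustrate, here is a minimal sketch of the robot-side part of such a link in plain Java. The port is one of the reserved ones; the one-line message format and the placeholder pose string are just assumptions for the sketch. In a real application you would read the pose from the servo runtime and run this loop in a background thread of your RoboticsAPIApplication:
Code: [Select]
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class PoseUdpLink
{
    private static final int LOCAL_PORT = 30001; // one of the reserved ports 30000-30010

    public static void main(String[] args) throws IOException
    {
        DatagramSocket socket = new DatagramSocket(LOCAL_PORT);
        byte[] rxBuf = new byte[1024];

        while (true)
        {
            // Block until the remote vision PC sends a command packet
            DatagramPacket rx = new DatagramPacket(rxBuf, rxBuf.length);
            socket.receive(rx);
            String command = new String(rx.getData(), 0, rx.getLength(), StandardCharsets.US_ASCII);
            // ... parse the command, e.g. a Cartesian correction, and hand it to the servo runtime ...

            // Reply with the current pose; the placeholder string stands in for
            // values read from the SmartServo runtime
            String pose = "X 0.0 Y 0.0 Z 0.0 A 0.0 B 0.0 C 0.0";
            byte[] txBuf = pose.getBytes(StandardCharsets.US_ASCII);
            InetAddress remote = rx.getAddress();
            socket.send(new DatagramPacket(txBuf, txBuf.length, remote, rx.getPort()));
        }
    }
}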

DrG

PS: Some practical hints:
  • It is essential to know exactly where the robot was located when the picture was taken.

    Advice: sample the robot's position data shortly BEFORE exposure as a proper estimate
    (much better than using the position AFTER the image has been read into the main memory of the image-processing CPU, especially if you use webcams via USB); see the sketch after these hints.
  • Don't forget: this scenario is feedback control (!). Since the camera is robot-mounted, any movement of the robot affects the image, and every image affects the robot's movement...
    ... in fact, the control loop is very likely to become unstable, or will show limit cycles.
    In my case, this happened at the very instant my boss had his first look at the new application :icon_wink:. The robot tracked the target nicely; then the target stopped, for whatever reason, and the robot moved in limit cycles with an amplitude of ~2-3 cm.
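A minimal sketch of that "sample before exposure" idea. The names theServoRuntime, _lbr and _toolAttachedToLBR follow the SmartServo sample; camera.triggerExposure() and sendToVisionPc(...) are hypothetical stand-ins for your camera driver and UDP code, and getCurrentCartesianPosition(...) should be checked against your Sunrise API version:
Code: [Select]
// Sample the pose immediately BEFORE triggering the exposure, so the image
// gets paired with the closest available robot position.
theServoRuntime.updateWithRealtimeSystem();
Frame poseAtExposure =
        _lbr.getCurrentCartesianPosition(_toolAttachedToLBR.getDefaultMotionFrame());
long stampNanos = System.nanoTime();

camera.triggerExposure();                    // hypothetical camera-driver call
sendToVisionPc(poseAtExposure, stampNanos);  // hypothetical UDP helper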

January 21, 2019, 06:44:19 AM
Reply #3

hsadeghian


I appreciate your informative reply.
I will try iiwa_stack as well.

One other problem I just noticed:
In the SmartServoSampleSimpleCartesian example, a sinusoidal motion is commanded for the TCP in the Z direction. But watching the robot's motion, you can see that it does not follow the given trajectory exactly. It is supposed to move only in the Z direction, but the robot moves in both X and Z, and not precisely.
Code: [Select]
try
{
    // Cyclic loop with some timing (in nanoseconds)
    double omega = FREQENCY * 2 * Math.PI * 1e-9;
    long startTimeStamp = System.nanoTime();
    for (i = 0; i < NUM_RUNS; ++i)
    {
        // Insert your code here, e.g. visual servoing or the like

        // Synchronize with the realtime system
        theServoRuntime.updateWithRealtimeSystem();

        // Get the measured position
        Frame msrPose = theServoRuntime
                .getCurrentCartesianDestination(_toolAttachedToLBR.getDefaultMotionFrame());

        // Emulate some computational effort, or waiting for external input
        ThreadUtil.milliSleep(MILLI_SLEEP_TO_EMULATE_COMPUTATIONAL_EFFORT);

        curTime = System.nanoTime() - startTimeStamp;
        double sinArgument = omega * curTime;

        // Compute a new commanded position
        Frame destFrame = aFrame.copyWithRedundancy();
        double offset = AMPLITUDE * Math.sin(sinArgument);
        destFrame.setZ(destFrame.getZ() + offset);

        theServoRuntime.setDestination(destFrame);
    }
}

I increased "MILLI_SLEEP_TO_EMULATE_COMPUTATIONAL_EFFORT" to give the controller time to reach the given destination, yet no change is observed. I also increased the values
Code: [Select]
aSmartServoMotion.setJointAccelerationRel(1.0);
aSmartServoMotion.setJointVelocityRel(1.0);

but no particular improvement is seen.
Why?
Thanks
« Last Edit: January 21, 2019, 06:45:51 AM by hsadeghian »

January 21, 2019, 03:35:53 PM
Reply #4

DrG


Hello Hsadeghian,

Quote
One other problem I just noticed:
In the SmartServoSampleSimpleCartesian example, a sinusoidal motion is commanded for the TCP in the Z direction. But watching the robot's motion, you can see that it does not follow the given trajectory exactly. It is supposed to move only in the Z direction, but the robot moves in both X and Z, and not precisely.

Well, if my memory serves me right, the example is based on the "SmartServo" motion, not "SmartServoLIN".
Therefore the interpolation is computed joint-wise -> the motion model behaves PTP-ish. Your reported behaviour is therefore explainable and matches expectations. If you want to run straight lines, you need to switch to "SmartServoLIN", which performs a Cartesian interpolation.
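A minimal sketch of the switch; the class and method names follow the SmartServoLIN sample shipped with Sunrise.Workbench, so check them against your version:
Code: [Select]
// Create the Cartesian servo motion starting from the current flange position
SmartServoLIN aSmartServoLINMotion =
        new SmartServoLIN(_lbr.getCurrentCartesianPosition(_lbr.getFlange()));
aSmartServoLINMotion.setMinimumTrajectoryExecutionTime(20e-3);

// Start it asynchronously and fetch the runtime
_lbr.moveAsync(aSmartServoLINMotion);
ISmartServoLINRuntime theLINRuntime = aSmartServoLINMotion.getRuntime();

// Inside the cyclic loop, command Cartesian destinations exactly as before:
theLINRuntime.updateWithRealtimeSystem();
theLINRuntime.setDestination(destFrame);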

DrG

January 23, 2019, 05:00:35 AM
Reply #5

hsadeghian


Thanks,
I also tried the SmartServoLIN sample. It is more precise, but there is still quite a difference between the commanded trajectory and the real one, even after increasing the acceleration and velocity limits:
Code: [Select]
ServoMotion.setMaxTranslationAcceleration(...);
ServoMotion.setMaxTranslationVelocity(...);

I am wondering how the camera velocity commanded by the visual servoing algorithm can be realized. The stability of these systems is usually shown assuming perfect tracking of the commanded camera velocities, so how is stability preserved!?

January 23, 2019, 11:13:41 AM
Reply #6

DrG


Hello again,
Quote
I am wondering how the camera velocity commanded by the visual servoing algorithm can be realized. The stability of these systems is usually shown assuming perfect tracking of the commanded camera velocities, so how is stability preserved!?

Well, as you already stated, right there is the incorrect assumption:
In the feedback control loop of a "real world" visual servoing system,
you need to consider (model) the transfer functions of ALL participants, especially that of the robot (and its interpolator). As a simple model, you could describe the robot, due to its inertia, at least as a PT2 element (proportional gain with second-order delay). (Wikipedia link, sorry, in German, but the pictures are nice: https://de.wikipedia.org/wiki/PT2-Glied;
an English link: https://hackaday.io/page/4829-identification-of-a-damped-pt2-system)
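For reference, the standard PT2 transfer function is G(s) = K / (T^2 s^2 + 2 d T s + 1), with gain K, time constant T and damping d; for d < 1 the step response overshoots and oscillates, which is exactly what feeds the limit cycles mentioned above.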

In fact, as you design your visual servoing controller, it is more an issue of "control loop performance" than of "stability"...
... since the feedback gains have to fit the true loop -> which results in lower feedback gains (to preserve stability), or better models so the gains can be increased again (to improve performance)...
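To make that tangible, here is a toy, self-contained simulation (plain Java, not KUKA code; all constants are invented for illustration). The robot is modelled as a PT2 element, the image processing as a pure delay of a few cycles, and the visual servo as a proportional law. With the low gain the error converges; with the high gain the loop oscillates with growing amplitude, which on a real robot the velocity limits saturate into exactly the limit cycles described above:
Code: [Select]
public class VisualServoGainDemo
{
    public static void main(String[] args)
    {
        final double dt = 0.01;         // 10 ms control cycle
        final double T = 0.15, d = 0.7; // assumed PT2 time constant and damping of the robot
        final int delaySteps = 5;       // assumed 50 ms of image-processing latency
        final double target = 0.10;     // constant feature offset the camera should null [m]

        for (double k : new double[] { 2.0, 10.0 })
        {
            double x = 0.0, v = 0.0;                // robot position and velocity
            double[] uBuf = new double[delaySteps]; // commands still "in flight" in the vision pipeline
            System.out.println("feedback gain k = " + k);
            for (int i = 0; i < 300; i++)
            {
                double uDelayed = uBuf[i % delaySteps];  // the robot acts on an old command
                uBuf[i % delaySteps] = k * (target - x); // a fresh command enters the pipeline
                // PT2 dynamics T^2 x'' + 2 d T x' + x = u, integrated with semi-implicit Euler
                double a = (uDelayed - 2 * d * T * v - x) / (T * T);
                v += a * dt;
                x += v * dt;
                if (i % 50 == 49)
                {
                    System.out.printf("  t=%.2f s  x=%+.4f m%n", (i + 1) * dt, x);
                }
            }
        }
    }
}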

DrG

PS: I can only repeat my practical hints from an earlier post:

It is essential to know exactly where the robot was located when the picture was taken.
Advice: sample the robot's position data shortly BEFORE exposure as a proper estimate.
(Hint: go for the measured position.)

Don't forget: this scenario is feedback control (!). Since the camera is robot-mounted, any movement of the robot affects the image, and every image affects the robot's movement...
... in fact, the control loop is very likely to become unstable, or will show limit cycles.
