Posts by Seulki

    The options that relate to Safety are "HRC" and "SafeOperation". If you're sure you have purchased them from your provider, they should already be installed on the controller (you might want to check the stickers visible when you open up the cabinet). Or, if your Sunrise Workbench already has the options, you can simply activate them under "StationSetup.cat - Software" in your project. And you need to run "Installation" once after changing anything in StationSetup.

    Due to the decreased impact we have not found time and priority to do measurements or modifications on the sunrise itself (I assume you mean code on the sunrise; in our calling code (fork of https://github.com/IFL-CAMP/iiwa_stack) I didn't find the class or methods you mention)

    If you're sure iiwa_stack is running with FRI, then it has to be there, since it's the first thing needed to create an FRISession. (Below is from the application example generated when you activate the FRI software option in StationSetup.cat.)

    Code
    // configure and start FRI session
    FRIConfiguration friConfiguration = FRIConfiguration.createRemoteConfiguration(_lbr, _clientName);
    friConfiguration.setSendPeriodMilliSec(5);
    getLogger().info("Creating FRI connection to " + friConfiguration.getHostName());
    getLogger().info("SendPeriod: " + friConfiguration.getSendPeriodMilliSec() + "ms |"
    + " ReceiveMultiplier: " + friConfiguration.getReceiveMultiplier());
    FRISession friSession = new FRISession(friConfiguration);
    FRIJointOverlay jointOverlay = new FRIJointOverlay(friSession);


    - I have no background on ROS, so my statements below may well be wrong.

    I briefly looked at the source code of iiwa_stack.

    I could see it's using Servoing (SmartServo) in its Sunrise application "ROSBaseApplication", but I didn't find any lines related to FRI either.

    If it's using FRI, the application running on Sunrise OS needs to contain the FRI setup to act as the FRI server, while the external device communicating with Sunrise has to run the FRI client. If no structure like the one sketched below is programmed, it's not using FRI.
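
    A minimal sketch of that server-side structure in a Sunrise application (assuming an injected LBR instance _lbr and the client IP _clientName; exception handling is omitted; the positionHold overlay is the usual FRI example pattern):

    Code
    // minimal FRI "server" sketch inside a RoboticsAPIApplication's run()
    FRIConfiguration friConfiguration = FRIConfiguration.createRemoteConfiguration(_lbr, _clientName);
    friConfiguration.setSendPeriodMilliSec(5);

    FRISession friSession = new FRISession(friConfiguration);
    friSession.await(10, TimeUnit.SECONDS);    // wait until the external FRI client connects

    // hold the position and let the client stream joint targets through the overlay
    FRIJointOverlay jointOverlay = new FRIJointOverlay(friSession);
    _lbr.move(positionHold(new PositionControlMode(), -1, TimeUnit.SECONDS)
            .addMotionOverlay(jointOverlay));

    friSession.close();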


    And if it's using Servoing, I guess you need to take a look at the periods of the timers, or at your own "controlLoop()".

    - How can I confirm which rate the KUKA is currently expecting?

    - How can I change this setting?

    Hello, you might want to check out:

    the FRIConfiguration class - it must be initialized somewhere by iiwa_stack (I guess?)

    and its methods .setReceiveMultiplier() and .setSendPeriodMilliSec() - you need to check both, and their explanations in the Javadoc.
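
    If I understand those methods correctly, the rate the controller expects is the product of the two. A sketch (the 5 ms / multiplier 2 values are just examples):

    Code
    // the controller sends its state every sendPeriod milliseconds;
    // it expects a new command every sendPeriod * receiveMultiplier milliseconds
    friConfiguration.setSendPeriodMilliSec(5);    // 5 ms send cycle
    friConfiguration.setReceiveMultiplier(2);     // -> commands expected every 10 ms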

    As far as I know, both real-time control options of the LBR work by letting us offer a new target position before the robot actually reaches the previously given target. So the interpolator never performs an exact stop and always moves toward the most recently given target. Thus it's mostly used for tracking things in real time, with the assistance of vision, a simulator, a control device, sensors, or anything else of the kind that can feed the robot positional information.


    If you can feed target positions from the external device (through ROS, or tracking devices for a basketball) in -nearly- real time, then using FRI or Servoing (a simpler, lower-performance kind of real-time control; it does not require an external program running on a real-time OS, and the period for accepting new targets is more than approx. 14 ms) sounds like a fine idea to me.
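
    For the Servoing route, a rough sketch of the SmartServo pattern (robot and the target source are placeholders; check the Servoing documentation for the exact setup):

    Code
    // rough SmartServo sketch - re-target the motion as new positions arrive
    SmartServo smartServo = new SmartServo(robot.getCurrentJointPosition());
    smartServo.setMinimumTrajectoryExecutionTime(20e-3);
    robot.moveAsync(smartServo);

    ISmartServoRuntime runtime = smartServo.getRuntime();
    while (targetsKeepComing) {
        JointPosition newTarget = ...;       // from ROS / your tracking device
        runtime.setDestination(newTarget);   // interpolator re-targets without stopping
    }
    runtime.stopMotion();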

    The workaround we can think of is to start a movement towards some pose, never using the interpolator commands, but using our own interpolation to move the robot to the desired goal. Once we are finished with what we want to do, we can drive the robot to the initial goal that iiwa.move(...) wants to reach and stop the robot application.

    I'm not sure if I've understood your statement correctly, but if you stick to the conventional way of commanding motions, the robot will always stop after each motion, unless you give the whole series of motion commands before starting the motion and combine them into one Spline motion, or blend them with moveAsync() / setBlending(), as sketched below.
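
    A sketch of the blended variant (frames P1..P3 are placeholders; I'm using setBlendingCart() here, which takes the Cartesian blending radius in mm):

    Code
    // blended motion chain - the robot rounds the corners instead of stopping
    robot.moveAsync(ptp(getApplicationData().getFrame("/P1")).setBlendingCart(20));
    robot.moveAsync(ptp(getApplicationData().getFrame("/P2")).setBlendingCart(20));
    robot.move(ptp(getApplicationData().getFrame("/P3")));    // last motion: synchronous, no blending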

    You've got different project settings (software options) on the two sides.


    1. You may want to download the project from the controller first, and then work on it.

    2. Look for the software options that don't match between the project on your controller and the one in your Workbench, and select the missing ones in your Workbench.

    - Under the [StationSetup.cat] - [Software] tab.

    3. If you are sure that the project in your Workbench is the right one, you need to "Install" the project before trying "Synchronize".

    - Under the [StationSetup.cat] - [Install] tab. Installation will delete whatever was on the controller and install the software options you selected in your Workbench. Installation is needed only once; afterwards you can just Synchronize.

    There will always be slight offsets when you use impedance control, because of inaccuracies in the torque sensors and in the mass data of the tool, and because some robot poses inherently enlarge the tolerances of the torque sensors.


    One, you can make use of an additional position-mode motion right after the impedance-mode motion.

    Code
    ObjectFrame tcp = ToolTest1.getFrame("/TCP");    // btw, keeping the TCP in a variable makes this easier, I guess
    tcp.move(lin(frame2).setMode(...));    // the impedance-mode motion (control mode omitted here)
    tcp.move(ptp(frame2));    // compensates the slight position offset; ptp is preferable to lin


    Two, you can measure the offset and compensate it with another motion using this offset, as sketched below.
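
    A sketch of that idea (assuming frame2 is the commanded target and tcp the moved TCP frame; the idea is to command a target shifted by the measured offset while staying in impedance mode):

    Code
    // measure the remaining offset after the impedance motion
    Frame actual = robot.getCurrentCartesianPosition(tcp);
    Frame corrected = frame2.copyWithRedundancy();
    corrected.setX(2 * frame2.getX() - actual.getX());    // shift the target by the offset
    corrected.setY(2 * frame2.getY() - actual.getY());
    corrected.setZ(2 * frame2.getZ() - actual.getZ());
    tcp.move(lin(corrected).setMode(...));    // move again, toward the shifted target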


    However, both of these will consume more time than you wanted.

    No, unfortunately it's not possible.


    One, you have to keep a separate project for each cabinet, each with its own IP settings.


    Two, if you really mean to have exactly the same project on both robots, install the project to one of the cabinets so that both controllers have the same IP. Then you can synchronize without changing the project's IP setting, but of course you need to unplug and replug the Ethernet cable each time, and you can not use a switching hub.

    I think I have to convert the gripper's interface into Ethernet if I connect it to the X2 port. So I can directly control my device via the X66 port at the cabinet, if my device is connected to the X2 port?


    BTW, I have another question: is there any way to directly read and write the X3 port of the flange instead of using Sunrise?

    No and No.


    As I mentioned, the wires from X2 to X66 are not simply wires. You can not modify any of the interfaces coming from the flange (except X74/75 to X76).


    The same goes for X3: it is coupled through an EtherCAT coupler inside the robot, and the only communication is done via the controller-robot cable.

    You need to use Sunrise as an interface between your device and a robot application.

    It could look like:
    Sunrise [robot application] - EtherCAT - X3 DIO - your device.
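
    On the application side that could look roughly like this (a sketch; MediaFlangeIOGroup is the generated I/O group class if your flange has I/O, and the pin accessor names here are hypothetical - check the class generated for your project):

    Code
    // toggle a flange output / read a flange input from the robot application
    MediaFlangeIOGroup flangeIO = new MediaFlangeIOGroup(controller);
    flangeIO.setOutputX3Pin1(true);               // hypothetical output name
    boolean sensor = flangeIO.getInputX3Pin2();   // hypothetical input name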


    Or, if you don't like any of these ideas, just run external cabling of your own and tie it to the silver plates marked "KUKA".

    Hello,


    First of all, X2 is an EtherCAT interface, not Ethernet. It is already coupled inside the robot and to the controller, which means you can not hijack the connection in between.


    Second, the Robotiq gripper's EtherCAT interface is supported by the LBR iiwa. You can simply import a device description file into the Sunrise system (WorkVisual is needed) and control the Robotiq gripper in your robot application (Java code).


    Third, if you want simple direct wires at the flange, you have X74 and X75, which bypass the robot body and come out through X76 at the base of the robot. You can make use of that interface on your own. It provides 2 power inputs and 6 lines (3 x paired and shielded).


    Fourth, the last question regarding WorkVisual is covered in the chapter named "Bus Configuration". If you are looking at the end-user manual, ask your provider for the SI manual.


    Fifth, if your device's interface is converted to Ethernet, you probably want to use the KLI interface on the controller, right above the power switch. Yes, it's the port you use when you do installation and synchronization. I personally think it will be the easiest way if you're well versed in socket communication programming; otherwise, using EtherCAT (the second proposal) would be my recommendation.
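
    The KLI way boils down to plain Java sockets in your application or background task. A minimal sketch (the port number is an arbitrary example, so make sure it doesn't collide with ports Sunrise itself uses; exception handling is omitted):

    Code
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.ServerSocket;
    import java.net.Socket;

    // minimal TCP server over the KLI
    ServerSocket server = new ServerSocket(30100);
    Socket device = server.accept();    // blocks until the device connects
    BufferedReader in = new BufferedReader(new InputStreamReader(device.getInputStream()));
    PrintWriter out = new PrintWriter(device.getOutputStream(), true);
    String command = in.readLine();     // message from your device
    out.println("ACK " + command);      // reply to your device
    device.close();
    server.close();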


    Regards,

    You could try overriding the dispose() method in your background task.


    Code
    @Override
    public void dispose() {
        ServerSocketClose_();    // close your server socket here
        super.dispose();    // not so sure this is needed; just my habit
    }


    And this dispose() method will be called when you press the stop button of the background task on the HMI.

    - You could do it by manually pressing this stop button, or find another way to call dispose() directly when you need it (on an "END" packet from a client, or the like).

    However, you still need to stop it (i.e., have dispose() called) before synchronizing.

    I experienced bug-like behavior when I tried to synchronize a project without manually stopping a background task.

    It seems the instance is not properly disposed of if the dispose() method hasn't been explicitly called.

    Hello JM,


    I think you can't do it with one line of code from a background task if you're not using the Sunrise "StatusController" option.

    However, you can still make use of other things.


    You could get the ITask instance of your application class.

    Code
    // in your backgroundTask
    @Inject
    private ITaskManager taskManager;

    public void run() {
        ITask robotApp = taskManager.getTask(YourApplicationName.class);
        // use try-catch - you will get a TaskNotAvailableException if your application is not running
    }

    You could also make use of ProcessData as a flag to see if your application is running, as sketched below.
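
    A sketch of the ProcessData flag idea (assuming a boolean process datum with the key "appRunning" declared in the project):

    Code
    // in the application: raise/lower the flag
    getApplicationData().getProcessData("appRunning").setValue(true);    // at the start of run()
    // ... application body ...
    getApplicationData().getProcessData("appRunning").setValue(false);   // before leaving run()

    // in the background task: poll the flag
    boolean appRunning = getApplicationData().getProcessData("appRunning").getValue();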

    Hello,


    A simple way to get a "deactivate ESM" function is to make an ESM state that has no extra safety effect beyond something like "Emergency stop local".

    That AMF is already configured in the KUKA PSM, so by changing the ESM state to this one, it will behave as if there were no additional AMF (but you will see two errors when the local E-stop is pressed).


    Switch to KRF when an ESM is violated. You will still be able to jog the robot, and once safety is ensured, the mode will automatically be switched back to T1.

    Hello,


    How about splitting the monitoring: one workspace for the robot, and another for the tool?

    I assume the monitored volume of the robot is represented as spheres with their origins at the centers of each joint,

    which probably does not take into consideration the mounting base, which is a hundred-some mm below the origin of Axis 1.

    This doesn't really explain what exactly happened with your setting (-19 working, -80 not), but separating the robot and the tool in your space monitors might be a workaround worth trying.

    Hi,


    I assume the DataRecorder class should have a method that lets you set a tool so that its mass data is compensated, something like "rec.setTool".

    I've never tried DataRecorder, but if you take a look at its Javadoc you might hopefully find some useful information.
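
    Just in case it helps, a rough and unverified sketch of how that API might be used (check the Javadoc for the exact signatures):

    Code
    // rough, unverified sketch - verify against the DataRecorder Javadoc
    DataRecorder rec = new DataRecorder("myRecording.log", 10, TimeUnit.SECONDS, 100);
    rec.addExternalJointTorque(robot);    // external joint torques
    rec.addCartesianForce(tcp, null);     // Cartesian force at the TCP
    rec.enable();
    rec.startRecording();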


    Also,

    You can intuitively get the values from a robot instance.

    Code
    robot.getExternalTorque()    // joint torques
    robot.getExternalForceTorque(tcp)    // Cartesian force/torque (wrench) at a given frame
    // e.g.:
    double currentForceZ = robot.getExternalForceTorque(tcp).getForce().getZ();    // N
    double currentTorqueA7 = robot.getExternalTorque().getSingleTorqueValue(JointEnum.J7);    // Nm

    It usually happens when the torque exerted on the axis has gone beyond its permissible limit.


    A6 has a +/- 40 Nm limit on its torque.

    You might want to re-evaluate the tool attached to the robot, and probably change the robot pose to relocate/disperse the external loads on A6/A7 to other axes, which have higher torque limits.

    The ptp() method has 3 or 4 parameter variants.

    The one you need to use is ptp(AbstractFrame); the parameter type AbstractFrame covers both the ObjectFrame and Frame classes.

    ObjectFrame: the frames you actually create with the Workbench interface (P1, P2, ...). Teach them and just use them (drag & drop into the editor).

    Frame: a modifiable class that contains a Cartesian position value. It can not carry teaching information and is only used within your code, like a normal variable.


    In short, try this

    Code
    lBR_iiwa_7_R800.move(ptp(new Frame(pose.X, pose.Y, pose.Z, pose.Alpha, pose.Beta, pose.Gamma)));

    - Be careful whether the angle values are in radians, and keep in mind that the Frame value is absolute with respect to your World (0,0,0,0,0,0), not relative to the current position.

    - The code might not work without corrections; I just wrote it down without checking.

    - Because a Frame can not carry Status/Turn values in teaching information, the robot motion could differ from your expectation, or the target could be unreachable.

    - And your A3 will remain still: 6 Cartesian values are not enough to determine all 7 joint values through inverse kinematics.

    It seems that you have probably configured the wrong MediaFlange type for your robot.

    The I/O group should contain a MediaFlangeIOGroup unless you're using a MediaFlange without any I/O (and those are only 2 out of the 10 variants).

    In the same sense, your X650-X651 cable connection might not be stable.

    Or, lastly, the gripper connection could be the cause as well.

    -> According to your error "SION CIB SR", some node of your EtherCAT connection is malfunctioning.


    In figure 2, try pressing the long bar where "HMI OK | 0 | (empty)" is displayed.

    You will see something like "HMI disconnected", and then "HMI connecting" to "HMI connected" when you press it again.

    Then try turning the key switch back. You may get your HMI back.


    If you were not the one who somehow messed the settings up, you should ask for support from wherever you got the robot.

    I think this is specific to the LBR Med.

    Not all versions of the Sunrise OS will allow this.


    External control has been offered for every type of LBR iiwa, throughout every version of Sunrise OS.

    It's not even an option that you can choose not to have.

    Just configure it in the Sunrise project settings and set up the actual communication (I/O or UDP); that's all.

    There's a possibility that your robot's current position has to be considered.

    LIN literally means a straight-line motion from where the robot is now to the target position.

    Even if you can reach the target with manual jogging, the robot might not be able to make a LIN path from its position to there if the path crosses some point the robot can not go through. In that case, neither a manual LIN command nor a LIN in an application will work.

    You said the robot won't PTP or LIN, but I think PTP would work in that case, either with a manual PTP command or in an application.


    To see if that's the case, try moving your robot manually to a position somewhere near the target, and try it again; then we will see.