Posts by Seulki

    Hello,


    It sounds like the picture below is what you need.

    Note that there's no configuration on X76 through X33/34. They are just wires running from the base to the MF, routed inside the manipulator. There's no EtherCAT coupler in that path and no direct control from the controller: simply wires that you can use as your own, along with power sources of your own.


    And the pin names [A,B,1,2,3,4,5,6,7,8,9] are shown on the X76 connector kit, which you can assemble on your own. If you don't have it, you need to get one from KUKA (both the X76 side and the X33/34 side).

    Hello pdoh,


    I wasn't aware that using Servoing and FRI simultaneously can work.

    You've probably considered it because FRI doesn't support commands in Cartesian space.


    If the Cartesian pose part does not need real-time control, you can handle it with an ordinary motion and let the overlay take care of the Wrench part.

    -> Not really sure if it's applicable to you, but you can still get the Cartesian pose data through another communication channel from your client, if you need it externally.
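    If you need such a side channel, a rough plain-Java sketch could look like the one below. Everything here (the UDP transport, the "x y z a b c" text format, the PoseSender name) is my own invention for illustration, not Sunrise API; on the robot the pose values would come from something like lbr.getCurrentCartesianPosition(...) inside a backgroundTask loop.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class PoseSender {
    // Format a pose [x, y, z, a, b, c] as a plain-text line "x y z a b c".
    public static String formatPose(double[] pose) {
        StringBuilder sb = new StringBuilder();
        for (double v : pose) sb.append(v).append(' ');
        return sb.toString().trim();
    }

    // Hypothetical side channel: push the current pose to the client
    // as one UDP datagram per cycle.
    public static void sendPose(DatagramSocket socket, InetAddress host, int port,
                                double[] pose) {
        try {
            byte[] data = formatPose(pose).getBytes(StandardCharsets.UTF_8);
            socket.send(new DatagramPacket(data, data.length, host, port));
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```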


    And if the Cartesian pose part still needs to be controlled in real time, you can stick with FRI only and try the transformationProvider (there's example code for it) along with the Wrench command.

    As far as I understood, the transformationProvider lets the client side transform a given Frame (Base), and the server side is affected by it in real time.

    So a command like 'lbr.moveAsync(controlledFrame).addMotionOverlay(ov);' will be controlled in both Cartesian pose and Wrench.

    -> Never tested it myself, but it might be worth a shot for you.

    I can't see all the parameters of your space monitoring.

    You have to specify the origin (x, y, z, a, b, c) of your cube, and then its length/width/height.

    You've probably missed the last three dimensions.
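    Just to illustrate why all nine values are needed: the origin (x, y, z, a, b, c) places and orients the cube, and length/width/height size it. A plain-Java containment check for the simplified case of an unrotated cube (a = b = c = 0; the names here are made up, and this is only a model, not how SafeOperation evaluates spaces internally):

```java
public class CubeCheck {
    // Unrotated monitoring cube: origin at one corner, extending by
    // length/width/height along X/Y/Z. The real safety configuration
    // additionally applies the a/b/c orientation to the cube.
    static boolean inside(double px, double py, double pz,
                          double ox, double oy, double oz,
                          double len, double wid, double hgt) {
        return px >= ox && px <= ox + len
            && py >= oy && py <= oy + wid
            && pz >= oz && pz <= oz + hgt;
    }
}
```

Without the last three dimensions the cube has no volume, which is why the configuration is incomplete without them.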


    Plus, AMFs act as 'and' conditions: a reaction occurs only when all three AMFs are violated.

    And 'none' is always violated.

    Meaning, you need to select 'handguiding device active' (AMF2) if you wish to monitor the space (AMF1) only while handguiding.

    As it is now, the space is always monitored except while you handguide.
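    The AND behaviour can be sketched in plain Java (the booleans are stand-ins for the 'violated' state of each AMF in one row; this is only a model of the rule, not safety code):

```java
public class AmfAnd {
    // A reaction fires only when every AMF in the row is violated;
    // an AMF set to 'none' counts as always violated.
    static boolean reactionFires(boolean amf1Violated, boolean amf2Violated, boolean amf3Violated) {
        return amf1Violated && amf2Violated && amf3Violated;
    }

    public static void main(String[] args) {
        boolean spaceViolated = true;       // AMF1: cube monitoring violated
        boolean handguidingActive = false;  // AMF2: 'handguiding device active'
        boolean none = true;                // AMF3: 'none' is always violated
        // Space violated but not handguiding -> no reaction:
        System.out.println(reactionFires(spaceViolated, handguidingActive, none)); // prints "false"
    }
}
```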

    Hi wang,

    If you want to make a safety configuration regarding 'space', you need the 'SafeOperation' software option.

    It seems you haven't activated it under StationSetup - Software. If it's not on the list there, then you don't have it, and you'll need to purchase it from KUKA.


    For handguiding mode, you need an enabling device such as a Media Flange with the Touch option.

    If you have two buttons on your Media Flange, you can configure them as the handguiding enabling switch under KUKA PSM, [enabling device]. I guess it should be on the second row, AMF1.

    It's worth checking the 'Region and Language' settings in your Windows.

    Try changing the 'Format' to English (any),

    and the 'System locale' under the last tab to English (any) as well.


    I guess networking runs into trouble when using a language that has special characters outside the English alphabet.

    Hello stcxkr,


    Yes, it's possible. But you should keep in mind that it merely means using the native Sunrise controller with the given methods, not redirecting I/Os to an external device directly.


    1. You could make use of a backgroundTask for the Ethernet connection.

    2. In your backgroundTask, you will have to write some functions that control the I/O based on the received message.

    -> Methods controlling the I/O are automatically generated in 'MediaFlangeIOGroup.java' if you selected it correctly while creating the project.


    - The code below briefly shows the concept. You'd have to take care of the Ethernet communication and its parsing yourself.

    - I'm not sure if you have @Inject on Sunrise OS 1.11; if not, initialize the instances manually.
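    For the parsing part only, a plain-Java sketch with no Sunrise classes (the "OUT&lt;n&gt;:ON/OFF" wire format and all names are made up for illustration; in the real task the parsed result would go into the generated MediaFlangeIOGroup setters):

```java
import java.util.Locale;

public class CommandParser {
    // Hypothetical wire format: "OUT<n>:ON" / "OUT<n>:OFF", e.g. "OUT1:ON".
    // Returns the output index and writes the desired state into state[0].
    public static int parse(String msg, boolean[] state) {
        String[] parts = msg.trim().toUpperCase(Locale.ROOT).split(":");
        if (parts.length != 2 || !parts[0].startsWith("OUT")) {
            throw new IllegalArgumentException("bad command: " + msg);
        }
        state[0] = parts[1].equals("ON");
        return Integer.parseInt(parts[0].substring(3));
    }

    public static void main(String[] args) {
        boolean[] state = new boolean[1];
        int idx = parse("out2:on", state);
        System.out.println(idx + " " + state[0]);   // prints "2 true"
        // on the robot you'd now call the matching MediaFlangeIOGroup setter
    }
}
```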

    Hello,


    If you're using an old 1.7 version, I guess you could still find it as below.

    As far as I know, these functions have not been provided or supported by KUKA since 2016.


    Code
    ModelParameterRequestService MPRS = new ModelParameterRequestService(lbr);
    IModelParameters MP = MPRS.requestModelData();

    Matrix mass = MP.getMassMatrix();
    Matrix jacob = MP.getJacobianMatrix();

    I know I'm late to this, but here's a bit of information on it.


    The problem is that the 'RoboticsAPI.data.xml' file has been corrupted, most likely during a shutdown/reboot where the shutdown process wasn't carried out properly. You can easily fix this by simply synchronizing your project again from Workbench to the controller.

    - If the synchronization doesn't work, you will need to delete everything under 'ApplicationServer\Projects\' and 'ApplicationServer\ProjectRepository\' (might be spelled wrong) on the controller side (via RDC or physically), and then try to synchronize again.

    The options related to Safety are "HRC" and "SafeOperation". If you're sure you have purchased them from your provider, they should be installed on the controller (you might want to check the stickers visible when you open up the cabinet). Or, if your Sunrise Workbench already has the options, it can be done simply by activating them under "StationSetup.cat - Software" in your project. And you need to run "Installation" once after changing anything in StationSetup.

    Due to the decreased impact we have not found time and priority to do measurements or modifications on the Sunrise side itself (I assume you mean code on the Sunrise; in our calling code, a fork of https://github.com/IFL-CAMP/iiwa_stack, I didn't find the class or methods you mention).

    If you're sure iiwa_stack is running with FRI, then it has to be there, since it's the first thing needed to create an FRISession. (Below is the application example shown when activating the FRI software option in StationSetup.cat.)

    Code
    // configure and start FRI session
    FRIConfiguration friConfiguration = FRIConfiguration.createRemoteConfiguration(_lbr, _clientName);
    friConfiguration.setSendPeriodMilliSec(5);

    getLogger().info("Creating FRI connection to " + friConfiguration.getHostName());
    getLogger().info("SendPeriod: " + friConfiguration.getSendPeriodMilliSec() + "ms |"
            + " ReceiveMultiplier: " + friConfiguration.getReceiveMultiplier());

    FRISession friSession = new FRISession(friConfiguration);
    FRIJointOverlay jointOverlay = new FRIJointOverlay(friSession);


    - I have no background in ROS, so my statement below may well be wrong.

    I briefly looked at the source code of iiwa_stack.

    I could see it's using Servoing (SmartServo) in its Sunrise application "ROSBaseApplication", but I found no lines related to FRI.

    If it's using FRI, the application running on Sunrise OS needs to contain the FRI parts to act as the FRI server, while the external device communicating with Sunrise has to be the FRI client. If nothing is structured like the code above, it's not using FRI.


    And if it's using Servoing, I guess you need to take a look at the periods of the timers, or at your own "controlLoop()".

    - How can I confirm which rate the KUKA is currently expecting?

    - How can I change this setting?

    Hello, you might want to check out the

    FRIConfiguration class (it must be initialized somewhere by iiwa_stack, I guess?)

    and its methods .setReceiveMultiplier() and .setSendPeriodMilliSec(); you need to check both and their documentation.
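    If I understand the FRI semantics correctly, the rate at which the controller expects new commands from the client is the send period multiplied by the receive multiplier, so the relationship is just:

```java
public class FriRate {
    // Effective period at which the controller accepts new commands from
    // the FRI client: sendPeriodMilliSec * receiveMultiplier.
    static int effectiveCommandPeriodMs(int sendPeriodMs, int receiveMultiplier) {
        return sendPeriodMs * receiveMultiplier;
    }

    public static void main(String[] args) {
        // e.g. the 5 ms send period from the example above with multiplier 2:
        System.out.println(effectiveCommandPeriodMs(5, 2) + " ms"); // prints "10 ms"
    }
}
```

So if iiwa_stack sets a 5 ms send period and leaves the multiplier at 1, the client is expected to answer every 5 ms.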

    As far as I know, both real-time control options of the LBR work by letting us offer new target positions before the robot actually reaches its previously given target. So the interpolator never performs an exact stop and always moves toward the most recently given target. Thus it's mostly used for tracking things in real time, with the assistance of vision, a simulator, a control device, sensors, or anything else that can feed the robot positional information.


    If you can feed target positions from the external device (through ROS, or tracking devices for a basketball) in near real time, then using FRI or Servoing (a simpler, lower-performance flavor of real-time control; it does not require an external server program aided by an RTOS, and its period for accepting new targets is more than approx. 14 ms) sounds like a fine idea to me.

    The workaround we can think of is to start a movement towards some pose but never use the interpolator commands, using our own interpolation instead to move the robot to the desired goal. Once we are finished with what we want to do, we can drive the robot to the initial goal that iiwa.move(...) wants to reach and stop the robot application.

    I'm not sure I've understood your statement correctly, but if you stick to the conventional way of commanding motions, the robot will always stop after each motion, unless you give the whole series of motion commands before starting the motion and make them one Spline motion, or blend them with moveAsync() / setBlending().

    You have different project settings (software options) on the two sides.


    1. You may want to download the project from the controller first, and then work on it.

    2. Look for the software options that don't match between the project on your controller and the one in your Workbench, and select the missing ones in your Workbench.

    - Under [StationSetup.cat] - [Software] tab

    3. If you are sure that the project in your Workbench is the right one, you need to "Install" the project before trying "Synchronize".

    - Under the [StationSetup.cat] - [Install] tab. Installation will delete whatever was on the controller and install the software options you selected in your Workbench. Installation is needed only once; after that you can just Synchronize.

    There will always be slight offsets when you use impedance control, because of inaccuracies in the torque sensors and in the mass data of the tool, and because of innate kinematic issues at certain robot poses that enlarge the torque sensors' tolerances.


    One, you can add a position-mode motion right after the impedance-mode motion.

    Code
    ObjectFrame tcp = ToolTest1.getFrame("/TCP");    // btw, a shorthand like this makes it easier, I guess
    tcp.move(lin(frame2).setMode(...));
    tcp.move(ptp(frame2));    // will compensate the slight position offset; ptp is preferable to lin.


    Two, you can measure the offset and compensate for it with another motion using that offset.


    However, both of these will consume more time than you wanted.

    No, unfortunately it's not possible.


    One, you have to have a separate project for each cabinet, each with its own IP settings.


    Two, if you really mean to have exactly the same project on both robots, install the project to one of the cabinets so that both controllers have the same IP. Then you can synchronize without changing the project's IP setting, but of course you need to unplug and re-plug the Ethernet cable each time, and you cannot use a switching hub.

    I think I have to convert the gripper's interface into Ethernet if I connect it to the X2 port. So can I directly control my device via the X66 port at the cabinet if my device is connected to the X2 port?


    BTW, I have another question: is there any way to directly read and write the X3 port of the flange instead of using Sunrise?

    No and No.


    As I mentioned, the wires from X2 to X66 are not simply wires. You cannot modify any of the interfaces from the flange (except X74/75 to X76).


    The same goes for X3: it is coupled through the EtherCAT coupler inside the robot, and the only communication path is the controller-robot cable.

    You need to use Sunrise as an interface between your device and a robot application.

    It could look like:
    Sunrise [robot application] - EtherCAT - X3 DIO - your device.


    Or, if you don't like any of these ideas, just use external cabling of your own and tie it to the "KUKA"-marked silver plates.

    Hello,


    First of all, X2 is an EtherCAT interface, not Ethernet. It is already coupled inside the robot and to the controller, which means you cannot hijack its connection in between.


    Second, the Robotiq gripper's EtherCAT interface is supported by the LBR iiwa. You can simply import a device description file into the Sunrise system (WorkVisual is needed) and control the Robotiq gripper from your robot application (Java code).


    Third, if you want simple direct wires on the flange, you have X74 and X75, which bypass the robot body directly and come out through X76 at the base of the robot. You can use that interface on your own. It provides 2 power inputs and 6 lines (3 shielded pairs).


    Fourth, the last question, regarding WV, is covered in the chapter named "Bus Configuration". If you are looking at the END user manual, ask your provider for an SI manual.


    Fifth, if your device's interface is converted to Ethernet, you probably want to use the KLI interface on the controller, right above the power switch. Yes, it's the port you use when you do installation and synchronization. I personally think this will be the easiest way if you're comfortable with socket programming. Otherwise, using EtherCAT (the second proposal) would be recommended.


    Regards,

    You could try overriding the dispose() method in your backgroundTask.


    Code
        @Override
        public void dispose() {
            ServerSocketClose_();
            
            super.dispose();    // not so sure this is needed. just my habit.
        }


    And this dispose() method will be called when you press the stop button of the backgroundTask on the HMI.

    - You could trigger it by manually pressing that stop button, or by directly calling dispose() yourself when needed (on an "END" packet from a client, or similar).

    However, you still need to stop it (i.e. have dispose() called) before synchronizing.

    I experienced bug-ish behavior when I tried to synchronize a project without manually stopping a background task.

    It seems the instance is not properly disposed if the dispose() method hasn't been explicitly called.
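    ServerSocketClose_() above is the poster's own helper; a plain-Java sketch of what such a cleanup could look like (no Sunrise classes, all names made up):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class CleanupSketch {
    private ServerSocket server;

    public void open() {
        try {
            server = new ServerSocket(0);   // port 0: let the OS pick a free port
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // What a ServerSocketClose_()-style helper could do: close the socket and
    // swallow the IOException so dispose() never throws during task teardown.
    public void serverSocketClose() {
        if (server != null && !server.isClosed()) {
            try {
                server.close();
            } catch (IOException e) {
                // log and continue; cleanup should not fail here
            }
        }
    }

    public boolean isClosed() {
        return server == null || server.isClosed();
    }
}
```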

    Hello JM,


    I think you can't do it with a one-line call from a background task if you're not using the Sunrise StatusController option.

    However you can still make use of other things.


    You could get the ITask instance of your Application class.

    Code
    // In your backgroundTask:

    @Inject
    private ITaskManager taskManager;

    public void run() {
        ITask robotApp = taskManager.getTask(YourApplicationName.class);
        // use try-catch: you will get a TaskNotAvailableException if your
        // Application is not running.
    }

    You could also make use of a ProcessData flag to see if your Application is running.