Posts by marlenehj

    Yessss! I wouldn't have guessed that was the issue, but you're absolutely right - it was the ABS function that was causing the error. I've now rewritten the expression as you suggested, and there are no more issues.
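    For anyone finding this thread later: the exact rewritten expression isn't quoted here, but a common ABS-free formulation that stays monitorable for a cyclical flag looks roughly like this (a sketch only - TOL_HI and TOL_LO are hypothetical helper variables, declarations omitted):

    ; latch must be initialized before the flag is assigned
    BOX_SENSOR_LATCH = BoxTypeSensor
    TOL_HI = BOX_SENSOR_LATCH + 0.1
    TOL_LO = BOX_SENSOR_LATCH - 0.1

    ; function calls such as ABS() are not allowed in a monitorable
    ; condition, so the tolerance band is written as two comparisons
    $CYCFLAG[1] = (BoxTypeSensor > TOL_HI) OR (BoxTypeSensor < TOL_LO)

    INTERRUPT DECL 11 WHEN $CYCFLAG[1] DO FreeOfNextBox_ISR()
    INTERRUPT ON 11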


    Thank you both SkyeFire and panic mode, I really commend the work that you do on here :)


    Have a great day.

    Just set the $CYCFLAG after BOX_SENSOR_LATCH is initialized. $CYCFLAGs can be assigned and redefined at any time in a program.

    Thank you! I didn't know I could assign it at any point - I naively thought it had to be declared in the declaration section.


    I've now assigned it after BOX_SENSOR_LATCH is initialized, and I get the following error:
    "Condition ABS(BoxTypeSensor - BOX_SENSOR_LATCH) > 0.1 is not monitorable."

    But I suspect this is because I'm working purely offline in KUKA.sim and get no actual input from BoxTypeSensor. I might try a small Python script to mimic an input.


    Thank you for your quick response. It's greatly appreciated :)

    Hi all,


    System details:

    KUKA KR 70 R2100

    KR C5 controller

    KSS 8.7.x (don't know specific subversion yet, as I'm currently working offline in KUKA.sim)

    KUKA.sim 4.1 and WV 6.0


    I want to compare a SIGNAL to a REAL as the trigger of an interrupt. I cannot use:


    INTERRUPT DECL 11 WHEN BoxTypeSensor <> BOX_SENSOR_LATCH DO FreeOfNextBox_ISR()


    as I need a tolerance of at least 0.1 between the two values before the interrupt triggers, since the SIGNAL oscillates a bit.

    I have tried:


    INTERRUPT DECL 11 WHEN ABS(BoxTypeSensor - BOX_SENSOR_LATCH) > 0.1 DO FreeOfNextBox_ISR()


    But I'm prompted that the ABS function is not allowed in an interrupt condition.

    Then I tried a cyclical flag, but my BOX_SENSOR_LATCH value is not initialized until later in the execution, so the interpreter complains that it hasn't been initialized when the cyclical flag is declared:


    $CYCFLAG[1] = ABS(BoxTypeSensor - BOX_SENSOR_LATCH) > 0.1

    Any suggestions on comparing two values including a tolerance in an interrupt declaration?

    Hi Panic Mode,


    Thank you for your response.


    I'm working on an employer-administered PC, and some changes to the security policy have been pushed, hence the access issue. I can easily see/find the files myself in File Explorer etc., but WorkVisual prompts "access denied" when trying to download a project into the "Downloaded Projects" folder.


    I have copied the entire "WorkVisual 6.0" folder into a directory which is not tracked by the newly added security policy, which should allow WorkVisual to access the files. Then I changed the directories where possible, i.e. for "Project handling", "Online Workspace" and "UpdateManagement" as you also stated, which now at least lets me use the Programming and Diagnosis tab. But there is no option to change the "Downloaded Projects" folder path in the WorkVisual settings, which means I cannot download projects in the Configuration and commissioning tab, which is super annoying.

    Hey,


    Does anybody know how to change the location/path of the "Downloaded Projects" folder in WorkVisual - i.e. the folder that stores the temporary projects from the controller when performing a "Browse for project"? On my PC it is located in Documents\WorkVisual 6.0\Downloaded Projects, but I'd like WorkVisual to reference it elsewhere, as this folder is currently inaccessible to me.


    My WV version is V6.0.22


    TIA!

    Hi everybody,


    System information:

    • KSS 8.7.3
    • HMI V 8.7 B442
    • Kernel system version KS V8.7.441
    • WorkVisual V6.0.22
    • KR C5 micro controller
    • KR 4 R600 robot

    I have an application where I want to grip an object with a vacuum gripper (I do not have GripperSpotTech, just FYI). There is a vacuum sensor attached, which is supposed to detect whether vacuum has been lost, i.e. the object has been dropped accidentally or the vacuum is faulty. The signal is a BOOL, where a high signal means vacuum is on and the object has been gripped, while a low signal means no vacuum has been established and hence no object has been gripped/the object has been dropped. As of now, the sensor is very sensitive despite setting the sensitivity way down, which means that small fluctuations in the signal occur during movement of the robot arm, so the sensor will report that the object has been dropped even though it is in fact still in the gripper.


    I am monitoring the sensor via a couple of interrupts, which are triggered when the object has been gripped (LOW-->HIGH) and when the object has been dropped (HIGH-->LOW). Right now, I'm just sending a warning to the PLC whenever an interrupt is triggered. However, to avoid falsely detecting that the object has been dropped, I have to look at the signal over a period of time to see if it remains LOW or if the interrupt was triggered by a false-positive fluctuation.


    I was thinking I'd like to start a timer once the interrupt is triggered and then check the signal again after the timer has run out, to see if it is still LOW, but I don't know the best way to do this without interfering with the motions of the main program.
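    Roughly, the pattern I have in mind looks like this (just a sketch - the signal names are placeholders, and I'm aware that a WAIT inside the interrupt routine stalls the advance run while it waits):

    DEF VacuumLost_ISR()
       ; Triggered on HIGH-->LOW of the vacuum signal.
       ; Wait a short settle time, then re-check the signal
       ; before reporting a real drop to the PLC. The motion
       ; already in progress continues while this runs.
       WAIT SEC 0.2
       IF VacuumSensor == FALSE THEN
          DropWarning = TRUE   ; placeholder output signal to the PLC
       ENDIF
    END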


    Does anybody have a suggestion on how this is best done?


    TIA!

    Thank you for your response, and you're right. I'm in contact with KUKA but they think it is the firmware of the SmartPad that needs updating - however, that doesn't really explain why the SmartPad works fine with three other identical controllers.


    Could be that somebody on here has experienced the same thing, but maybe it's a standalone issue :)

    Hi everybody,


    System information:

    • KSS 8.7.3
    • HMI V 8.7 B442
    • Kernel system version KS V8.7.441
    • WorkVisual V6.0.22
    • KR C5 micro controller
    • KR 4 R600 robot

    I have just received a brand new KUKA KR 4 R600 which has been wall mounted. I've started it up with no errors and then downloaded the project from the controller to WorkVisual. In WorkVisual I am able to set the robot to inclined installation (90 deg in B) in the machine data editor, and I'm also able to edit it on the SmartPad in $ROBROOT.


    In order to accept this change in hardware (from floor mounted to wall mounted), I need to access the safety configuration. However, when trying to do so, the HMI application crashes and leaves me on the Windows screen, with no other option than to restart.


    It should be noted that even without this change (floor to wall), and with a completely empty project, the HMI crashes when opening the safety configuration. Additionally, I have three extra KR C5 micro controllers with identical configurations, where this particular SmartPad works without crashing and I have no issues opening the safety configuration or accepting the new changes.


    Has anybody experienced a similar situation and perhaps could point to a solution?

    VisionTech is quite powerful and allows a lot of customization. I was modifying the C# code and adding my own data etc. to the results. If using another vision system, one can do pretty much anything as long as the message is in the format that PickControl expects (if you are still using PickControl).


    Note that synchronization of the camera and conveyor can be an issue depending on conveyor speed. This usually means using a high-speed output from the camera to trigger the conveyor, or a separate registration sensor, or the conveyor speed needs to be sufficiently low.

    Ah okay! We were under the impression that there was little to no customization possible with VisionTech. That's good to know.


    With regards to synchronization, does this primarily apply to high conveyor speeds? The conveyor speed for this application will be around 300 mm/s. We want to trigger the vision on a switch on the conveyor and start tracking the item upon registration, and once the image has been processed, we want to "marry" the registered item with the correct (x, y) position on the conveyor and the orientation of the item given by the vision system - I believe I read that, at least with PickControl, this is possible with the use of a timestamp check.

    I cannot comment on anything else, but just out of curiosity, is the vision system by any chance based on an Omron FH controller?


    Even as a big fan of ML, if you can make it work without the ML part, you will most definitely have a much easier task at hand. What are you trying to find that you cannot do without ML? :/

    No, the vision is not based on an Omron FH controller.

    I agree that using "traditional" computer vision would be a much nicer road to go down. However, the reason for using ML or deep learning is that the items to be picked have little to no markers that would allow us to detect the correct orientation of the item (critical for further processing) with standard vision techniques, and the items will vary widely - so what we are seeing today might not be what we are seeing a week from now, but we should still be able to find the "right way up" in terms of orientation.

    Thank you for your quick response, panic mode. Firstly, the new version of ConveyorTech allows just one resolver connected to the first robot controller, with the resolver signal shared over a RoboTeam network - so they've made this part a bit easier than having three resolvers for three controllers. However, I do agree with you that it seems I will still need to do quite a bit of customization with the setup I have.


    Unfortunately, it is not viable in this specific project to use VisionTech, as the vision task is not trivial - we will have to utilize deep learning, so VisionTech will just not cut it. Hence the need for an external vision system, even though this complicates things further as you've correctly stated.

    massula, thank you for your response. That is exactly what I was afraid of - that information on the orientation could not be passed on. I see how your example application relates to this, but I can't help wondering if I'd just be wasting time "reinventing the wheel" when it seems PickControl will do this for you. You mention you've worked with ConveyorTech - have you also worked with PickControl?

    Hi everybody,


    I've purchased four KUKA KR4 R600 robots with C5 micro controllers running KSS8.7. I've purchased KUKA.ConveyorTech 8.1 with my robots, and I will elaborate on that below. I'm working with WorkVisual 6.0.


    So, first of all, I have not yet received my robots, so I'm not able to test anything just yet - I'm trying to prepare as much as I can beforehand because I'm on a tight schedule.

    The intended operation is as follows: Three of the robots are to be picking from one conveyor. The position and orientation of the workpieces are given by a vision system (non-VisionTech) which is running on an IPC and sending information for each workpiece to a PLC, which will distribute a given workpiece to a given robot. Now, I have been advised to purchase ConveyorTech over PickControl, but I am having my doubts that ConveyorTech will be able to fulfill the requirements of the system. The main concerns I have are:


    1. Load sharing. Specifically, when one robot is unable to pick an item, PickControl will allow for the next robot to pick it instead, whereas with ConveyorTech, this is something to be implemented in the PLC. However, it is not important that all items are picked, since they will roll back into the feeder if they are not picked, but I presume the load sharing properties of PickControl will give a nicer flow.
    2. External vision system. I've not been able to find out in the ConveyorTech manual, how to "marry" a workpiece's position and orientation (given by the vision system), with the tracked workpiece in ConveyorTech (given by the sync switch). It is critical that the workpieces are picked with the correct orientation of the tool because of later processing. Does anybody know how to do this in practice?

      Additionally, with PickControl it seems that it is not quite straightforward to integrate an external (non-VisionTech) vision system in PickControl - does anybody have any experience with this?
    3. Object types. The workpieces are of different types, which contain information on how they should be handled in a later operation (by the same robot). After reading the PickControl manual, I understand that you can assign object markers to the items to be picked, which might be able to specify what I'd like to have attached to each item. Does anybody know of a similar thing in ConveyorTech?

    All in all, the good thing about ConveyorTech is that it seems a lot easier to set up than PickControl - I'm just unsure that it will be able to perform the way I want it to, despite it being the recommendation from KUKA.


    So, you guys: does anybody have advice on the three points highlighted above, or on the choice of ConveyorTech vs. PickControl for this specific use case?