Touch Sensing RARCWMDP & #ARC_WMCV commands

  • Hello KAWASAKI guys,


    We are working on an ARC WELDING project using a KAWASAKI BA006L robot with a KEMPPI A7 machine.


    We are writing the overall program in AS language (not by teaching).


    Everything is fine except the touch sensing option.


    We started using the XAC function and it is working as expected.


    We are using touch pattern 4 (3 points).


    The issues are:


    - At the time we are teaching the robot the welding path points, how can we inform the robot that the following touch points are the references for future welding?


    - Assuming that we have replaced the workpiece with a new one and touch sensing has been performed by the robot, how can we get the corrected points (transformation or joint values) or the deviation values?


    - In the arc welding manual we found the instructions RARCWMDP, #GET_ARC_WMDP and #ARC_WMCV, but we don't know how to use them to save reference touch points and to get the deviation for every new workpiece.


    Any suggestion or sample code about using touch sensing in AS language would be highly appreciated.


    Thanks in advance.

    Regards,

    SAQER ALI
    ROBOTICS WORLD FZE

  • I always assumed work deviation patterns were only applicable with BLOCK instructions (please correct me, someone, if I'm wrong).


    In AS you use the XAC command to move towards the target and, upon detection, it records the new target location - therefore creating your 'pseudo' pattern as required.


    - At the time we are teaching the robot the welding path points, how can we inform the robot that the following touch points are the references for future welding?


    XAC taught_pos, detect_pos, dist_to_chk, spd_of_chk

    LDEPART 100

    LWS detect_pos ; will move to the detected position and commence weld.
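
    One way to handle the 'reference' part of that question purely in AS is simply to keep the detected positions from a master workpiece under their own names. This is only a sketch - every name here is a placeholder and the XAC arguments follow the example above:

    ; run once on the master / reference workpiece
    XAC taught_pos, detect_pos, dist_to_chk, spd_of_chk
    POINT ref_det = detect_pos    ; store the detected position as the reference for later comparison
    LDEPART 100

    On every new workpiece you would repeat the same touch and compare the new detect_pos against ref_det, as per the next point.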


    - Assuming that we have replaced the workpiece with a new one and touch sensing has been performed by the robot, how can we get the corrected points (transformation or joint values) or the deviation values?


    Corrected points are stored as transformation values, therefore:

    POINT taught_pos+pos_diff = detect_pos

    will result in pos_diff holding the difference between taught_pos and detect_pos.

    You could then DECOMPOSE pos_diff to convert it to real values if required.
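
    As a rough sketch of that last step (reusing pos_diff from above; dev, xdiff and ydiff are just placeholder names):

    DECOMPOSE dev[0] = pos_diff    ; dev[0]..dev[5] = X, Y, Z, O, A, T components of pos_diff
    xdiff = dev[0]
    ydiff = dev[1]
    TYPE "Deviation X =", xdiff, "  Y =", ydiff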


    - In the arc welding manual we found the instructions RARCWMDP, #GET_ARC_WMDP and #ARC_WMCV, but we don't know how to use them to save reference touch points and to get the deviation for every new workpiece.


    These are new to me and only appear in the latest manual. I have tried them in KROSET and get no errors, but I can't quite get my head around the results.

    I suspect they are intended to obtain deviation values when used in conjunction with BLOCK... but I am completely guessing.

    I think you would need to contact Kawasaki to confirm the function and application of these.

  • Many thanks for your kind feedback.


    I agree with you that touch-sensing patterns are applicable to BLOCK teaching, and we have used them previously for small-scale projects. However, in the new arc welding manual we found new instructions; we tried them on the real robot and, as you found in K-Roset, the results are not logical at all.


    I am in touch with KRG (Kawasaki Germany) about those new commands, but they don't have any idea about them either.


    I informed KRG that nowadays most welding paths are generated using OLP software, which means touch sensing commands should be usable from AS language; otherwise there is no need for OLP software for arc welding applications!


    Unfortunately we don't have access to KHI in Japan to find out exactly how to use those new commands in AS.


    But I am sure that we can use touch sensing in AS, since many OLP vendors such as Delfoi and Octopuz advertise that their solutions support KAWASAKI touch sensing.


    If you can support us by getting more details from KHI, it will be highly appreciated.


    Thanks again for your support all the time.

    Regards,

    SAQER ALI
    ROBOTICS WORLD FZE

  • Quote

    I am in touch with KRG (Kawasaki Germany) about those new commands, but they don't have any idea about them either.

    This doesn't surprise me with the commands being new, but I expect if you make a request to them, they may do some research on your behalf to ascertain functionality of those commands.

    However, from what I have tested in KROSET for touch sensing (not that KROSET is the easiest place to simulate touch sensing), these commands are specifically directed towards BLOCK programming only, and I wouldn't be surprised if that was their final answer.


    Quote


    I informed KRG that nowadays most welding paths are generated using OLP software, which means touch sensing commands should be usable from AS language;

    I think KRG will already be aware of this fact and they will end up referring you to the XAC command, as this is the AS command specifically dedicated to arc weld touch sensing - not workpiece deviation.

    Workpiece deviation appears to be something you will need to code yourself from an AS perspective.
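
    As a very rough outline of what that self-coded routine could look like (a sketch only - the speeds, distances and every name are placeholders, the XAC arguments follow the earlier example, and x_taught/y_taught are positions taught on the master workpiece):

    .PROGRAM find_deviation()
      ACCURACY 1 FINE
      XAC x_taught, x_det, 50, 10       ; touch towards the X reference face of the new workpiece
      LDEPART 100
      XAC y_taught, y_det, 50, 10       ; touch towards the Y reference face
      LDEPART 100
      POINT x_taught+x_diff = x_det     ; deviation of the X touch relative to its taught point
      POINT y_taught+y_diff = y_det
      ; extract the components (DECOMPOSE or DX/DY) and apply them to the weld points before welding
    .END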


    Quote


    But I am sure that we can use touch sensing in AS, since many OLP vendors such as Delfoi and Octopuz advertise that their solutions support KAWASAKI touch sensing.

    Again, you can by using the XAC command; what you appear to be asking for is a 'clickable wizard' of the kind you say other OLPs are already providing.


    What sort of code are Delfoi and Octopuz producing for arc welding, then? Are they including the touch sense commands, or are they just creating points and basic motion instructions to move to them?


    I can make a program using a whole host of standard AS Commands to create a touch sensing sequence:

    ACCURACY 1 FINE

    XMOVE target+offset TILL 1002

    Or

    ACCURACY 1 FINE

    LMOVE target+offset, 1

    SWAIT 1002


    But I would not need to, as there is XAC, which is a dedicated arc weld command that automatically links in with whatever IO I dedicate to it.


    The only thing I could recommend is to look at the KROSET Touch Sensing Demonstration Program.

    This contains some very good information regarding AS commands and arc weld touch sensing, and if Delfoi or Octopuz are producing this type of code, I may need to eat my own words...


    The other alternative is to try and obtain KCONG, which is dedicated to arc welding OLP, but it is very expensive and, the last time I tried, they were very reluctant to supply it outside of Asia.

  • The other alternative is to try and obtain KCONG, which is dedicated to arc welding OLP, but it is very expensive and, the last time I tried, they were very reluctant to supply it outside of Asia.

    Yes; we tried to get KCONG to use it for welding and milling, and we got exactly what you said: it is dedicated to the Asian market only.

    However, from what I have tested in KROSET for touch sensing (not that KROSET is the easiest place to simulate touch sensing), these commands are specifically directed towards BLOCK programming only, and I wouldn't be surprised if that was their final answer.

    Thanks for this result; at least we know that those new commands are for BLOCK teaching and there is no need to keep trying them in AS.


    I think KRG will already be aware of this fact and they will end up referring you to the XAC command, as this is the AS command specifically dedicated to arc weld touch sensing - not workpiece deviation.

    Workpiece deviation appears to be something you will need to code yourself from an AS perspective.

    We are trying to write code to compensate for the workpiece deviation using the XAC command.

    Please check the attached code.


    I would like to get your recommendations to enhance this code and make it more reliable.


    robot-forum.com/attachment/24639/

    Regards,

    SAQER ALI
    ROBOTICS WORLD FZE


  • I've had a look over it and it works quite well (in KROSET).

    I look at things through very simplistic glasses, and what you've written is very clearly broken up into sections, easy to follow, flows well in terms of sequencing and does the job indeed.

    It is very difficult to offer any recommendations, other than alternative methods.


    I would definitely consider incorporating the XMWIRE command, though, to set an even wire length prior to touch sensing (see the short sketch after these points).

    - This will give you the best results for touch sensing/tcp values and reduce errors.

    - Especially if you are using high crater currents to finish the weld on the previous cycle.
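
    Something like this is all I mean - I would double-check XMWIRE's exact arguments against the arc welding manual for your controller, as I am quoting it from memory:

    XMWIRE    ; trim/feed the wire to an even stick-out before sensing (check whether arguments are required)
    XAC taught_pos, detect_pos, dist_to_chk, spd_of_chk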


    You could adapt to using dedicated reference points on the workpiece for the X and Y touch sense points, as opposed to points very close to the intended weld start and end points.

    - Only because this allows for more flexibility around any shape of workpiece you intend to weld.

    - Also, if you were to introduce a work object frame, then its X/Y planes could be the references.

    - By using references, you could then locate the weld start point based on the XY shift results too.


    Also, DECOMPOSE is a very good command to use; however, if I were targeting just X and Y, then I would favour DX and DY to create the x difference and y difference values.

    However, DECOMPOSE gives you access to more data, which is there if you need it, and to be honest I am personally trying to use it more and more as opposed to just DX/DY/DZ.
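
    For example, with the same placeholder names as earlier, the DX/DY version is just:

    xdiff = DX(detect_pos)-DX(taught_pos)    ; differences in base coordinates
    ydiff = DY(detect_pos)-DY(taught_pos)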


    In addition to this, I would possibly utilise SHIFT in the LW commands and incorporate the x difference and y difference values into them (see the sketch below).

    - This gives slight protection against users being able to POS MOD them easily, which can be good or bad depending on your customer.
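
    i.e. something along these lines, where xdiff/ydiff are the deviation values from the touches and weld_start/weld_end are your taught weld points (LWE is assumed here - use whichever weld motion instructions your program already has):

    LWS SHIFT(weld_start BY xdiff, ydiff, 0)
    LWE SHIFT(weld_end BY xdiff, ydiff, 0)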


    I've attached a video displaying the running code.

    I wrote it based on what you wrote, but condensed it somewhat... but other than that, what you have done thus far looks OK to me. :top:

  • You're welcome. :top:

    Just remember, touch sensing can be applied in many ways:

    - Locate an actual workpiece.

    - Locate a start point to weld.

    - Measurement for work deviation or checking.


    Try different methods once you completely understand the function; this way you will produce a good 'working template' for future installations.

    And I have learned something from your example too, so thank you for that also. :beerchug:

  • Hi,


    Does this touch sensing work through a current completing a circuit and activating a signal,

    or is it something to do with encountered force?


    Example: can I touch sense with a Schunk gripper to locate the position of a new pallet/crate

    with products?


    Sorry for the errors in the writing.


    Greetings, bertus

  • Quote

    Does this touch sensing work through a current completing a circuit and activating a signal?

    Yep. :top:

    The current is provided by the weld power source in touch sensing, not by the robot.

    The robot sends a request to the weld power source to provide voltage (somewhat variable, depending on parameters); as there is voltage, current is available.

    If current flows, the weld power source sends a signal back to say there is current flow - therefore a touch has occurred.

    These are all built-in functions within a weld robot and weld power source.


    How could you conduct electricity through a pallet? Your medium would need to be conductive.

    Therefore, Kawasaki has a command called XMOVE (also available in BLOCK), which includes a signal feedback function for this kind of situation.

    - You would usually use an infrared/ultrasonic/laser/proximity switch in handling applications.

    - I have sometimes used an 'old mouse click button' to do some testing by strapping it to the end of the gripper.
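
    As a very rough illustration of how you might use it to locate something like a pallet (the signal number, speeds and names here are all made up):

    ACCURACY 1 FINE
    SPEED 10 MM/S
    XMOVE SHIFT(search_start BY 0, 0, -200) TILL 1001   ; creep towards the pallet until sensor input 1001 comes on
    BREAK
    HERE found_pos                                      ; capture roughly where the motion was interrupted
    LDEPART 100

    Keep the search speed low, as there will always be some overrun after the signal is detected.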

  • Hi,

    Thanks for your reply.


    My pallets are made of steel and I position them in a fixed frame, but it is not accurate enough and it takes

    too much time to place the pallet correctly (+/- 1 cm).

    If there is a good, easy and quick way to correct the misplacement, that would save time and downtime.


    I was thinking of a little program that automates the re-teaching of the frame/orientation/position.


    Is collision detection an option, or the soft absorber, or is it easier with a little current?



    Greetings, bertus

  • Collision detection/shock detection 'detects' force, whereas the soft absorber 'allows' force.

    Neither are accurate methods of product detection.

    Quote


    If there is a good, easy and quick way to correct the misplacement, that would save time and downtime.

    You would be better off using XMOVE and standard object detection sensors for something like that, or possibly employing a vision system above the pallet location to send the new location coordinates to the robot.

    The most effective and common technique would be to use actuated clamps to square the pallets in place.

    This would negate the extra cycle time required to 'search' for misplacement and probably be more cost effective.

  • Many thanks for the video.

    I can see now what you meant when asking about touch sensing as a target locator.

    It looks like a very basic solution (I don't mean that in a negative way), I am surprised to see RFA on the side of the robot.

    From my experience, RFA provide a very solid solution for machine tending/palletizing and tend to provide good, solid integrated solutions too.

    This does not look in line with the RFA I know; have they just supplied you the robot system, then, and you are integrating it yourself?


    Looking at the solution you have I can see:

    - Part Presentation (Pallet reference to the robot is not guaranteed), but I assume (cannot see clearly) the pallet is located into a 'fixed right angle'.

    - Crate Presentation on the pallet (Crate reference to the Pallet is not guaranteed).

    - Crate Presentation on the crates (Crate reference to Crate is not guaranteed).

    - Tolerance on gripper (when gripping and when releasing can introduce XY and orientation changes).


    These 4 variables alone require some sort of compensation adjustment at the robot side.

    I doubt any type of conventional touch sensing would yield repeatable results (however, I could be wrong).


    I know with my electronics knowledge you could use standard IO and 'emulate' touch sensing, but there are clear grounding issues.

    - The only effective solution here would be an output to send voltage, and current detection feedback to an input.

    - Alternatively you could utilize a laser or a physical proximity switch.

    - However, as for the results, could this be made to work to detect XY and orientation change - Possible, but I think some further testing is required.


    Saying that, you would need to repeat the detection not only on the 'pick up' of the crate, but also after you have 'placed' the crate.

    - Due to the gripper tolerance and movement between transfer and put down.

    - This may impact on your cycle time.


    I am surprised to see that there are no dowels/edges on the crates to 'square' them up.

    - This would be my first question: could the crates be modified for this?

    - This would result in you having a 'squared' stack on the empty and the completed sides.

    - Thus only leaving crate-pallet reference and gripper tolerance to solve.


    I think you may be best served by creating another thread in either the main Kawasaki board or the General Robotics board; I think you would receive a more varied range of responses about effective solutions to this.

  • - Part Presentation (Pallet reference to the robot is not guaranteed): yes, the pallet is located into a 'fixed right angle'.

    - Crate Presentation on the pallet (Crate reference to the Pallet is not guaranteed): the crate is slightly bigger than the pallet.

    - Crate Presentation on the crates (Crate reference to Crate is not guaranteed): it is not very clear in the video, but there are flip things present.

    - Tolerance on gripper (when gripping and when releasing can introduce XY and orientation changes): when it puts the products back, that is not an issue.

    I have put my comments after each of your points above.

    Quote

    This does not look in line with the RFA I know; have they just supplied you the robot system, then, and you are integrating it yourself?

    I am more than 100% happy with what RFA delivered. The crates are my idea (the idea is that the products are loaded onto the crates at the sawing machine, so no hands touch them until they go to the customer), therefore saving a lot of handling compared with other solutions like Halter/Selro. Putting 20 or 30 products on a grid is not the most effective way to make products that on average take 5-10 minutes each. With the crates I have products with a runtime of almost 40 hours (480 pieces times 5 min).

    The other robot in the back has vision with a conveyor belt, but it holds only 80 to 150 products that you have to load and unload, so there is still handling. Of course it is a work in progress that never stops, but even when you think it is perfect you need to improve - always improve!


    I asked RFA to make a system so simple that even my mother could use it, and they did just that! And for the cost of 1857 hours of labour (the salary saved for the guy that normally puts the products in), the payback time is very short.


    Greetings, bertus
