Hi. Anyone out there who knows and has integrated KR5-Sixx (650) + Scorpion Vision Software + Sony Camera?
KUKA + Scorpion vision Software + Sony Camera
-
isapa -
June 5, 2012 at 5:48 PM -
Thread is marked as Resolved.
-
-
What are you trying to do?
I looked into Scorpion Vision years ago. Its premise is that you can take any camera, use it with their software, and within a couple of clicks be right where you want to be. Personally, I didn't find that to be the case. Yes, there are only a few clicks, but within each click there are many parameters to add.
I also looked into Keyence software and used it for a project. I wrote an app for it that lets you trigger the camera from the robot as well as view the images.
If I had to choose, I would personally go with the Cognex camera. It may not be for everyone, but to me it is as simple as using an Excel spreadsheet, and I can see everything that is going on right in front of me. There are several different methods for getting the camera to do the job for you:
1. KUKA Vision. This is an app that was developed by KUKA UK (RIP). It's what I started off using in the past, though my recommendation here would depend on which camera version you use.
2. Serial communication. I had a problem with KUKA Vision and was going to go with route #3, but I had issues setting it up. When I called Cognex, they told me that the KUKA robot is only capable of talking to other devices over serial communication. That made me giggle.
3. EIP (EtherNet/IP). This is the mode I used. Configuring and setting it up was a little challenging, but it can be done.
4. As if I don't have enough pet projects going on, I'm going to try to rewrite this into a plugin for the robot in the near future.
Is there any specific reason you want to use Scorpion Vision with the Sony camera? What camera version is it? Last I remember, you might actually be able to communicate with the camera directly over XML.
-
Hi Mookie,
We are trying to do VGR (vision-guided robotics) for a gasketing/dispensing application.
You are absolutely correct; it is taking us a long time to set up the Scorpion system. Lots of clicks to be done. We are now stuck with this Scorpion Vision, and I find the software difficult to set up. I'd just like to ask whether yours worked. I really need assistance to get this set up to work as VGR.
The robot now "handshakes" with the vision PC. Our problem right now is programming the KUKA to read the coordinates/data from the vision system. Maybe you have an idea how to do it....
Regards,
isapa
-
There's a way to set up the measurement within the software: it basically maps pixel size to real-world distance. Some vision systems also use extra tools to correct for lens distortion toward the edges of the image.
I tried to set up the software on my machine but can't because of the firewall here at work. I will try again in a couple of hours and see if I can help.
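The pixel-to-real-world mapping described above can be sketched as follows. This is a minimal illustration of the idea, not Scorpion's actual calibration API; the function names and numbers are invented for the example, and it ignores lens distortion.

```python
# Sketch of pixel-to-real-world calibration (illustrative, not Scorpion's API).
# Calibrate against a target of known size, then convert pixel coordinates to mm.

def mm_per_pixel(known_width_mm: float, measured_width_px: float) -> float:
    """Scale factor derived from a calibration target of known physical width."""
    return known_width_mm / measured_width_px

def pixel_to_world(px: float, py: float, scale: float,
                   origin_px: tuple = (0.0, 0.0)) -> tuple:
    """Convert an image coordinate to mm relative to a chosen origin pixel."""
    return ((px - origin_px[0]) * scale, (py - origin_px[1]) * scale)

scale = mm_per_pixel(100.0, 500.0)   # a 100 mm target spans 500 px -> 0.2 mm/px
x_mm, y_mm = pixel_to_world(750.0, 250.0, scale, origin_px=(500.0, 0.0))
print(round(x_mm, 3), round(y_mm, 3))  # 50.0 50.0
```

Real systems replace the single scale factor with a full camera calibration, but the principle is the same: every pixel measurement must be converted into the work-table's units before being sent to the robot.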
-
Hi. Mookie,
Thanks a lot.
Isapa
-
Dear All,
We have the camera coordinates and robot coordinates already set to the same point. Both the camera and robot systems are now communicating.
The camera is a Sony XCD-U100 with a FireWire (IEEE 1394b) frame grabber. The software is Scorpion Vision. The camera is now being triggered by the robot. We are trying to dispense liquid sealant (gasketing process; please follow the link to see the video).
External content: www.youtube.com (video link)
The camera is fixed above the work table. Our objective is to dispense sealant on the workpiece without jigs and fixtures holding the part in a fixed position. The target process is as follows:
1.) Loading of the workpiece onto the work table at any position.
2.) The camera shoots and sends data/coordinates to the robot.
3.) The robot processes the data/coordinates and adjusts its tool path to do the process. At this point, we are stuck at number 3: data and coordinates are being exchanged, but we do not know what command/program to write or insert to make the robot follow the part coordinates given by the camera...
Please help...
ISAPA
-
Have you constructed a common reference frame between the robot and the camera?
Have you set up the robot's motion path for the "golden" condition -- where the part is placed such that the offset sent from the Skorpion would be 0?
Is the Skorpion sending offsets to the robot referenced to the common reference frame?
If you have completed all those steps, then the basic operation would be something along the lines of:
ReferenceBase needs to be saved somewhere in the robot where it is never adjusted except during setup. ShiftedBase will always be ReferenceBase plus whatever offsets the Skorpion sends in X, Y, Z, A, B, C. If you have programmed your sealer path in inline forms, you will need to change ShiftedBase to the base used for the sealer path (BASE_DATA[x]).
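The ShiftedBase = ReferenceBase + offsets idea can be illustrated with the underlying planar math (X, Y, and rotation A only). This is a sketch of what the frame composition does, not KUKA's implementation; the `compose` function is invented for the example.

```python
import math

def compose(base, shift):
    """Compose two planar poses (x, y, a_deg): result = base then shift.
    The shift is expressed in the base frame's own coordinates, which is why
    it must first be rotated by the base's angle before being added."""
    bx, by, ba = base
    sx, sy, sa = shift
    rad = math.radians(ba)
    x = bx + sx * math.cos(rad) - sy * math.sin(rad)
    y = by + sx * math.sin(rad) + sy * math.cos(rad)
    return (x, y, (ba + sa) % 360.0)

# ReferenceBase rotated 90 deg in world coordinates: a +10 mm X offset from
# the camera therefore moves the shifted base +10 mm along world Y.
shifted = compose((100.0, 200.0, 90.0), (10.0, 0.0, 0.0))
print([round(v, 3) for v in shifted])  # [100.0, 210.0, 90.0]
```

This is exactly why the common reference frame matters: if the camera's offsets were applied in a frame rotated differently from the robot's base, the part shift would be applied in the wrong direction.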
-
Hi Skyefire,
Give us some time to work through and check per your instructions.
Thanks,
Isapa
-
1.) Have you constructed a common reference frame between the robot and the camera?
Ans: Yes
2.) Have you set up the robot's motion path for the "golden" condition -- where the part is placed such that the offset sent from the Skorpion would be 0?
Ans: What do you mean?
3.) Is the Skorpion sending offsets to the robot referenced to the common reference frame?
Ans: I am not sure.
Regards,
Isapa (Walter)
-
"Golden" condition: Normally in these applications, one sets the part in its ideal position/orientation, and teaches the robot sealer path to that part, with no input from the vision system. The vision system is then aligned such that, when the part is in exactly this position/orientation, it will measure an offset of 0 in all dimensions.
Alternatively, one can set up the vision system, measure the part, apply the Base frame shifts to the robot, and then program (or touch up) the points along the robot's sealer path. This approach is easier, but it does require that one always follow the proper sequence of operations (Vision -> Base offsets -> point adjustments) before making any path changes; the results can be quite unpredictable otherwise.
As to the common reference frame: for any robot-and-vision combination, the two units must share a common reference frame. Otherwise, the 6DOF transform will not function properly.
For example: you mention that the Skorpion camera is mounted above the work table, looking down at the part. The Skorpion has its own frame of reference (usually with 0,0 in the upper left corner of the image, Y+ left-to-right, X+ top-to-bottom). Even for a simple 2-D measurement in X and Y, the camera and the robot must both be using a reference frame that has X and Y aligned in the same directions. If you wish to add a rotation (X, Y, and rZ being the standard usage for a single-camera system), the two reference frames must not only agree perfectly in orientation, but must also share the same origin.
The simplest way to do this is to construct a new working reference frame in each system. Most vision systems can create a reference frame from certain points in the image, and most robots can create a base frame by touching three points with a well-defined pointer tool. As such, one of the simplest ways to perform this mutual alignment is to mark three reference points (normally defining the origin, the X+ axis, and the X+Y+ plane) on the work table, such that they are visible to the vision system and can be touched accurately by a pointer mounted on the robot. As long as both systems use the same three points in the same frame-creation process, the result should be a common reference frame shared between the two systems. The vision system must make the offset measurements it sends to the robot relative to this reference frame, and the robot must apply those offsets in the same reference frame.
-
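The shared-frame construction described above can be sketched in 2D: both systems build their frame from the same marked points, after which any point measured in that frame means the same thing to both. The function names below are invented for illustration; a real 3-point method also uses a third mark to fix the X+Y+ plane.

```python
import math

def frame_from_points(origin, x_point):
    """Build a planar frame from two marked points on the work table:
    the origin and a point on the desired +X axis (2D analogue of the
    3-point base-teaching method)."""
    dx = x_point[0] - origin[0]
    dy = x_point[1] - origin[1]
    return (origin[0], origin[1], math.degrees(math.atan2(dy, dx)))

def to_frame(frame, world_pt):
    """Express a world point in the frame's own coordinates."""
    fx, fy, fa = frame
    rad = math.radians(fa)
    wx, wy = world_pt[0] - fx, world_pt[1] - fy
    return (wx * math.cos(rad) + wy * math.sin(rad),
            -wx * math.sin(rad) + wy * math.cos(rad))

# Both robot and camera derive their frame from the same two marks:
frame = frame_from_points((50.0, 50.0), (50.0, 150.0))  # +X along world Y
px, py = to_frame(frame, (40.0, 60.0))
print(round(px, 3), round(py, 3))  # 10.0 10.0
```

If the camera reports (10.0, 10.0) in this shared frame, the robot applying that offset in the same frame lands on the same physical spot, which is the whole point of the mutual alignment.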
Hi Skyefire,
Please see attached files containing "robot programs", "camera result string" and "camera captured image" based on our application.
ROBOT:
In the archive file KRC/R1/Program, there are 4 programs:
cam2 - sealant path using inline programming
crw - vision communication, cread and cwrite
mainEdited - the main program
var_global - variable declarations
These are sample programs sent by the KUKA support team as our reference, but we don't have enough knowledge to create the exact programs for our application. We are not yet trained as expert programmers, but we are trying to improve our skills in expert programming. We just edited some of the programs.
Running the main program, the robot can trigger the camera to capture the object after sending the request string "X", and then the camera will measure.
Nothing happens, and there is no movement from the robot after the measurement. Can you check the programs and comment on which parts need changes?
CAMERA:
In the file "image" you can see the captured object with a "translation value" below it, measured from the base reference point (indicated by the upper pen) to the reference point of the object (indicated by the lower pen). The "translation value" should become the shifted base of the program "cam2", i.e. the sealing path.
What is the right command to apply the "translation value" as the new base of program "cam2"? The other file, "CreateResultString", is one of the camera's tools for building a result string.
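On the robot side, the result string read via CREAD has to be split into numeric offsets before they can populate a shift frame. The actual layout produced by Scorpion's CreateResultString tool is configurable, so the semicolon-separated "x;y;angle" format below is only an assumption for illustration:

```python
# Sketch of parsing a camera result string into offsets. The field layout
# ("x;y;angle", semicolon-separated) is an assumed example, not Scorpion's
# fixed format -- match it to whatever CreateResultString is configured to emit.

def parse_result(raw: str, sep: str = ";") -> dict:
    """Split a result string such as '12.5;-3.2;1.8' into named offsets."""
    x, y, a = (float(field) for field in raw.strip().split(sep))
    return {"X": x, "Y": y, "A": a}

offsets = parse_result("12.5;-3.2;1.8")
print(offsets)  # {'X': 12.5, 'Y': -3.2, 'A': 1.8}
```

In KRL the equivalent parsing is done with CREAD format specifiers; whichever side does the splitting, the parsed X/Y/A values are what get written into the shifted base.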
Regards,
Isapa
-
My normal charge rate is $125/hr for this stuff....
1. Is the command getting from the robot to the Skorpion?
2. Is the Skorpion sending data back? And in what format? See https://www.robot-forum.com/robotforum/kuk…24362/#msg24362. Use the Telnet window to see the raw RS232 exchanges at the buffer level. https://www.robot-forum.com/robotforum/kuk…blems/msg12156/
3. If the robot never moves at all, then probably you are failing the VALUE<>0 check in CRW.SRC.
4. You'll need to add something like this to MAIN, after you finish pulling all the offset values:
Code
DECL FRAME _ShiftFrame ; this line needs to go at the very top of the module

; get all the offsets
_ShiftFrame.X = Center_X
_ShiftFrame.Y = Center_Y
_ShiftFrame.Z = Center_Z
_ShiftFrame.A = Angle
BASE_DATA[2] = BASE_DATA[1] : _ShiftFrame
; keeps Base1 permanent, treats Base2 as a "scratchpad" for the current offset.
; Base1 must be the common reference frame shared between robot and Skorpion
5. In CRW.SRC, you need to change the points for the sealer path from Base1 to Base2.
You are going to have to do some serious study and R&D here if you want to do this successfully. Use the forum's Search function. Dig out the Serial and Expert Programming manuals and read them fully.
-
KUKA Vision Tech is the new German tech pack, with full support.
KUKA Vision (RIP): never liked it, too buggy IMHO.
-
Hi Skyefire,
1. Is the command getting from the robot to the Skorpion?
Ans. Yes
2. Is the Skorpion sending data back? And in what format?
Ans. Not sure; there are no signs of any changes on the robot after the camera captures.
I will try again to edit crw.src and add your recommended program lines to my main program.
Thank you. I will send you a personal message regarding the charge rate.
-
Did you use the Telnet monitor diagnostics, as described in the links I posted?
Are you monitoring the variables involved during the communications process to track any state changes?
-
I already checked the Telnet monitor; the exchanges are the same. But I used COM1 on both the KUKA controller and the external PC.
Previously, when I checked the variables after the camera captured, there were no changes in the variables involved in the communication.
I will try to add something or make some changes in the programs, and then I will give you an update next Friday.
I cannot run a test right now because my camera is being used for another purpose.
-
COM1? Well, that may be part of the problem. Except on some older controllers, I don't think you can use COM1 on the robot.
Briefly: the KRC2 controller has a sort of split brain: one half is VxWorks, the realtime OS that runs all the critical functions of the robot (safety, motion, I/O). The other half is Windows XPe, which really is only there to make a nice, pretty user interface.
The COM port assignments have changed over time with different generations of the controller, but on any of the controllers made since 2005 or so, I believe that COM1 is dedicated to Windows, and COM3 is dedicated to VxWorks. Which means that only COM3 is available for use from inside the robot KRL interpreter.
I had assumed you had properly configured the COM ports, but if you're trying to use COM1, that's obviously not the case. Use this: https://www.robot-forum.com/robotforum/man…ttach;attach=76
-
Hi Skyefire,
I sent you a personal message; have you read it?
Isapa
-
kuka vision tech is the new GERMAN Tech pack with full support
Kuka vision rip never liked it too buugy imho
You didn't like KUKA Vision?
What didn't you like about it?
-
Hi Mookie,
I will try KUKA Vision Tech
isapa
-