If you have the math option in the robot it helps. In the PLC we take a REAL and create two INTs: one for the left side of the decimal, the other for the right side. Send those values via GI to the robot, where the number gets reassembled in BG Logic: new number = left side + (right side / 1000). For negatives, the PLC sets a DI and the math just multiplies by -1.
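Not actual ladder or BG Logic, but a minimal Python sketch of the same split-and-reassemble scheme (assuming three decimal places, i.e. a divisor of 1000, and a separate sign flag standing in for the DI):

```python
def split_real(value, scale=1000):
    # PLC side: split a REAL into two non-negative INTs plus a sign flag.
    # left  -> GI for the integer part
    # right -> GI for the fractional part (3 decimal places at scale=1000)
    negative = value < 0
    v = abs(value)
    left = int(v)
    right = round((v - left) * scale)
    return left, right, negative

def reassemble(left, right, negative, scale=1000):
    # Robot side (what the BG Logic math does):
    # new number = left + right/scale, negated if the sign DI is set.
    value = left + right / scale
    return -value if negative else value
```

Note the precision is fixed by the divisor: 1000 gives you three decimal places, so anything finer is rounded away on the PLC side.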
Posts by rumblefish
-
Have you tried using BG Logic to capture the sysvar for the joint position? I believe BG Logic scans at 8 ms or so. You'll probably still have to compensate for joint speed/reaction time.
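As a rough sketch of that compensation idea (the 8 ms scan figure and the constant-joint-speed assumption are mine, not from the manual):

```python
def compensate_joint(captured_deg, joint_speed_dps, scan_s=0.008):
    # Predict where the joint actually is, given that the BG Logic
    # capture may lag by up to one scan (~8 ms) at the current speed.
    # captured_deg: angle read from the sysvar, in degrees
    # joint_speed_dps: current joint speed, degrees per second
    return captured_deg + joint_speed_dps * scan_s
```

At 100 deg/s, one 8 ms scan is 0.8 degrees of lag, which may or may not matter for your application.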
-
Is the calibration status "Done" in payload IDENT? I've found that when using IDENT, if the payload isn't first calibrated without an EOAT as a baseline, it won't take new data. Run it without the tool and enable calibration mode.
-
For DI/DO there is no setup; it's just addressing on the Pro-face side:
DI1 > %Q1
DO1 > %I1
Registers will have to have SNPX assignments made. This is where the FANUC HMI manual comes into play.
SNPX uses R (registers), but these are not related to the numeric "R" registers on the DATA screen.
In an SNPX assignment you assign:
Address - the start of the SNPX register range to use
Size - how many SNPX registers are required
Var_Name - the robot data to read and its format, i.e. PR, R, SysVariable, etc.
Multiply - decimal precision/place, basically
I highly advise reading the FANUC HMI User manual. Here's a quick setup for reading registers.
Keep in mind, Pro-face will use the starting address of 5000 (%R5000). It can be changed to 1.
ASG 8 refers to the sysvar $SNPX_ASG[8]. This is where you set your SNPX assignments. You'll see there are quite a few; I just happened to use 8 for this example. It can be changed to ASG 1.
ASG 8: Address 5000, Size 50, Var Name R[150] (Data Registers 150-175), Multiply 1.0, 32b Signed INT. This should get you started.
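My understanding of the Multiply field (worth verifying against the HMI manual for your driver): the robot value is scaled by Multiply before being exposed as an integer SNPX word, so a Multiply of 100 effectively gives two decimal places on the HMI side. A quick Python illustration of that assumption:

```python
def robot_to_snpx(value, multiply=1.0):
    # Robot -> HMI: scale by the Multiply factor, then round to a
    # signed integer word (what the HMI actually reads).
    return int(round(value * multiply))

def snpx_to_robot(word, multiply=1.0):
    # HMI -> Robot: undo the scaling to recover the decimal value.
    return word / multiply
```

With Multiply 1.0, as in the example assignment above, a register value of 12.7 would simply arrive as 13.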
-
Comm setup or the snpx assignment structure? I've previously posted a pic of a comm setup on the proface side.
-
Make the table a "Machine", add servo. "Machines" can have simulated motion. You can then command motion and even use DO to control it. You might make it a conveyor type if you need it to spin continuously.
-
Yes, AGFS is auto grid frame set. It won't move the 7th axis; however, it is a much more accurate method of setting frames and cam calibration. The big issue is maintaining the working distance with the camera (camera-to-part distance). The type of vision process you use will also be another factor. I went through 3-4 VPs until I found one that worked for my app. I didn't need 6 degrees of freedom in the offset. Hopefully this helps.
-
You can master the 7th axis wherever you want. Just be sure to record/mark that position on the rail. Verify the gearing for the 7th axis. If you have it set to extended integral you should be able to jog in World/Tool and move the carriage (7) with the TCP remaining constant.
Regarding iRVision, if it is a robot mounted camera be sure to use AGFS to set cam cal and uframes.
-
$MNUFRAMENUM[G] will show the active uframe. G being the group number.
-
The IO will config as DO like this:
Rack 36
Slot 0
Start 1 (depending on what you're starting with)
-
So just to update, it turns out the math was/is correct. The X-axis error was due to inconsistencies in the locating. I was using GEDIT to find circles; however, the vision process cannot find the true center automatically. The origin was drifting and not repeating from snap to snap. This was confirmed with FANUC as well. For whatever reason, GEDIT processes differently than a "taught" image with actual contrast lines.
-
Gotcha,
Usually the common name is a stumbling point for most.
-
What you are referring to are two different "IO-Link" buses. The popular sensor bus IO-Link is a device-level protocol. FANUC's I/O Link is a proprietary serial bus. They are two separate protocols.
-
This is the SNPX HMI Device option; it's a paid option. You'll need to buy it and get a PAC code from FANUC. It's simple to install, only takes a few minutes.
-
You say the vision offset is in world, then later you state the part is located in UF[2]. Which is it?
Sorry, it's a work in progress. It was world, but I changed it to UF2. World was giving me fits; the robot is upside down on a rail and rotated 90° from the UFs.
I had offset only due to being lazy; it is a 2D found-pos process, so it still spits out the found pos as an offset.
$MNUserFrame
I think if you added the current user frame, then subtracted the desired user frame. Haven't ever done this, but I think that would work. Let me know if it does....
That's what I'm doing: taking the difference between the two frames and then multiplying against the found pos.
-
I'm trying to figure out how to convert a Cartesian position from one frame to another.
I have an application which uses a robot-mounted camera to locate a product on a tray. The tray is a calculated user frame, so the uframe can change based on the tray. The vision process is set up to use world for the offset frame. The issue is that the robot is told where to go on the tray by a parent system, and it needs to report the found pos back to the parent. My problem is converting the found pos in world to the tray uframe. I was trying to take the inverse of the uframe and matrix it against the world found pos, but am not having any luck.
Does anyone know of a solution for this? I need a nudge in the right direction, but I'm not opposed to just being told how to do it either. Any help is appreciated!
Here's the current logic.
The part is located in UF2 but needs to be picked in UF3.
PR[228]=VR[1].OFFSET (Found Pos Vis Proc)
PR[230]=UF[3] (UF3 Changes based on cart geometry)
PR[230],13=0 (Extended Rail 7th Axis)
CALL INVERSE (230,230)
PR[233]=UF[2] (Static, never changes)
PR[233],13=0
CALL INVERSE (233,233)
PR[234]=PR[230]-PR[233]
CALL MATRIX (234,228,60,160)
I'm off in the X axis and not sure why.
PS: The reason for inspecting in world is due to error in the original setup. I changed the vision process offset frame to a static frame that doesn't change.
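Not FANUC's MATRIX/INVERSE routines, but a minimal planar (x, y, θ) sketch in Python of the transform chain I'd expect here: compose the found position with UF2 to get world, then apply the inverse of UF3. All frame values in the test are made up for illustration; note that frames compose by matrix multiplication, not element-wise subtraction of frame components.

```python
import math

def frame(x, y, deg):
    # 3x3 homogeneous transform for a planar frame at (x, y), rotated deg.
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]

def mul(a, b):
    # 3x3 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inverse(t):
    # Rigid-transform inverse: rotation transposed, translation -R^T * p.
    c, s = t[0][0], t[1][0]
    x, y = t[0][2], t[1][2]
    return [[ c,  s, -(c * x + s * y)],
            [-s,  c,  (s * x - c * y)],
            [ 0,  0,  1]]

def convert(p_uf2, uf2, uf3):
    # Point found in UF2, expressed in UF3: p_uf3 = inv(UF3) * UF2 * p_uf2
    t = mul(inverse(uf3), uf2)
    px = t[0][0] * p_uf2[0] + t[0][1] * p_uf2[1] + t[0][2]
    py = t[1][0] * p_uf2[0] + t[1][1] * p_uf2[1] + t[1][2]
    return px, py
```

The same chain extends to full 6-DOF with 4x4 matrices, which is effectively what the MATRIX/INVERSE calls above are doing.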
-
I would just get a smart/managed switch and plug everything into that. That is the topology for almost every EIP network I've done. Balluff BNI blocks allow you to physically daisy-chain the network connections.
EIP connects like a normal Cat5 Ethernet network and can coexist with other protocols on a network.
Sounds like you're on the right track now.
-
What is the "Product Coupler"? If it is a network switch, there shouldn't be any issue with EIP. If you're trying to daisy chain devices, it then becomes dependent on device passthrough for the network connection.
Did the supplier give a reason as to why it won't work?
Balluff is a good product, I have had success with them in the past. Allen Bradley 1734 I/O works well with the robot as well.
-
I don't remember specifically if it is due to crossing an axis or a lack of turn data; maybe someone else can explain it better. It still reaches the Cartesian point; however, J4 (for example) could flip. The TCP is still at the position, just with a different config.
When you say "not going to exact position", do you mean the calculated position vs the displayed position are different? Or is it the robot didn't line up with a part/fixture?
If it is the latter, you need to verify the stack-up tolerance on the trays/fixtures. If the destination has 20 spots and they were built to +/- 50 µm, you could have up to a mm of error at the end. You can't expect a robot to pick to +/- 0.5 mm if the tray is +/- 2 mm.
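A back-of-envelope version of that stack-up arithmetic (linear worst case, where every spot's tolerance lands in the same direction; a statistical stack-up would be less pessimistic):

```python
def worst_case_stackup(n_spots, tol_per_spot_mm):
    # Worst-case accumulated error at the last spot if each spot's
    # build tolerance stacks linearly in the same direction.
    return n_spots * tol_per_spot_mm

# 20 spots at +/- 0.05 mm each -> up to +/- 1.0 mm at the end of the tray
```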
-
One thought, verify the robot isn't changing config when it reaches the point. We do a lot of calculated matrices and I have fought this before. The robot can reach the same point in space with a different joint configuration. Theoretically it should be the same point but from what I've seen it will introduce an error.
Just my 2 cents.