It shows them as unassigned in your picture.
Posts by HawkME
-
To do that I think you'd have to put in a BG Logic program that constantly checks the current position, compares it to the previous position, and adds that value to a running total. Typically those values don't update while jogging though.
A much easier thing to get would be the total mechanical motion time. That is saved in a system variable.
-
A couple options you could try.
1. Don't connect your external supply to the Fanuc IO at all. Only use it for 24 V and GND on the Schunk. For gripper open/close, just use the robot DOs powered from the internal Fanuc supply.
2. Same as 1, but put a relay between each robot DO and the Schunk open/close inputs, and use the external supply to power the open/close signals through the relay contacts.
Option 2 is the safer bet.
-
You should wait for UO[1] (Cmd Enabled) to turn on, then pulse UI[18] (Prod Start).
-
You need to boot into a controlled start, maintenance menu, and modify the robot and axis setup.
-
Most do not have a regular IO card in the cabinet, but some models do. You can always add one as an option.
Almost all have some IO on the shoulder of the robot, typically 8 inputs and 8 outputs.
-
I would think angular speed would be somewhere in the variables.
To do the calculation you simply need to calculate the time rate change of position.
Speed = dP/dt.
You could do this in BGLogic:
dP = currentAngle - lastAngle
dt = ($Fastclock - lastTime)/500
Speed = dP/dt
lastAngle = currentAngle
lastTime = $Fastclock
Just replace the fake variable names above with registers, and currentAngle with the appropriate system variable. The divide by 500 converts clock ticks to seconds, because $FAST_CLOCK increments in 2 ms steps on most controllers. You should use a timer to verify that constant and change it if needed.
This should give you results in degrees per second.
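A minimal Python sketch of the same math (variable names are mine), useful for sanity-checking the units before putting it in BG Logic:

```python
def angular_speed(current_angle, last_angle, current_ticks, last_ticks):
    """Speed in degrees per second from two angle/clock samples.

    Assumes the clock ticks every 2 ms (most controllers), so
    elapsed seconds = ticks * 0.002, i.e. ticks / 500.
    """
    d_p = current_angle - last_angle           # degrees moved
    d_t = (current_ticks - last_ticks) / 500   # seconds elapsed
    return d_p / d_t
```

For example, 0.09 degrees over 4 ticks (8 ms) works out to 11.25 deg/s.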
-
Main prog:
1: RUN Main1;
2: RUN Main2;
The RUN command is non-blocking so they will both run simultaneously.
Just make sure Main has no groups in its motion group mask, and that Main1 and Main2 each have only their own group in theirs.
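A sketch of how the motion group masks would look in each program's DETAIL screen, assuming a 2-group controller:

```
Main    group mask: [*,*,*,*,*]   (no motion groups)
Main1   group mask: [1,*,*,*,*]   (group 1 only)
Main2   group mask: [*,1,*,*,*]   (group 2 only)
```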
-
Probe 2 points along each axis. Calculate the slope of each axis line. Then calculate W, P, and R by taking ATAN(slope) for each.
I would suggest using matrix multiplication to adjust the WPR values.
You can set a user frame equal to a position register (UFRAME[n]=PR[m]) to create the UF.
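To illustrate the ATAN step, a minimal Python sketch (function name is mine). Using atan2 instead of atan(dy/dx) avoids the divide-by-zero case for vertical lines; apply it once per plane to get W, P, and R:

```python
import math

def axis_angle(p1, p2):
    """Angle, in degrees, of the line through two probed points."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    # atan2 handles dx == 0 (vertical line) without a divide-by-zero
    return math.degrees(math.atan2(dy, dx))
```

For example, two points probed along a line rotated 45 degrees, such as (0, 0) and (1, 1), give an angle of 45.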
-
I will give you the equations needed.
Slope: m = deltaY / deltaX
line: y=mx+b
slope of a perpendicular line:
m_2 = - 1 / m_1
Find intersection point by solving for the 2 line equations together. That gives you the final X and Y.
Final R = atan(m)
*Note, you must prevent any divide by 0 errors. Those will occur anytime you have a vertical or horizontal line for one of the slopes. So if that happens I just set the R angle directly to 0 or 90 and directly set X and Y to one of the raw values.
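Here are those equations as a Python sketch (names are mine). Given the measured line y = m*x + b and a point the perpendicular passes through, it returns the intersection point and final R, with the horizontal case guarded as described above:

```python
import math

def perpendicular_intersection(m1, b1, px, py):
    """Foot of the perpendicular from (px, py) to y = m1*x + b1,
    plus the final R angle in degrees.
    (A vertical measured line has no slope form and needs its own
    branch, with R set directly to 90.)"""
    if m1 == 0.0:                      # horizontal line: avoid -1/0
        return px, b1, 0.0
    m2 = -1.0 / m1                     # slope of the perpendicular line
    b2 = py - m2 * px                  # perpendicular through (px, py)
    x = (b2 - b1) / (m1 - m2)          # solve m1*x + b1 = m2*x + b2
    y = m1 * x + b1
    r = math.degrees(math.atan(m1))    # final R = atan(m)
    return x, y, r
```

For example, the perpendicular from (2, 0) to the line y = x meets it at (1, 1) with R = 45.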
-
What is giving the start signal, a PLC?
Double check that you are giving the correct signal for each group.
-
What value are you using to actually move the robot? Is it moving to the PLC-calculated value based only on the found position?
-
Build a fixture to position your parts when doing the reference, then document it.
-
Are you familiar with UI signals and PNS?
If not, please read the manual. You will need to make sure the Multi-UOP Interface option is installed. Then you will see 2 sets of UI signals to start and stop each robot from your PLC.
In your post above you mention PNS, so I assume you already understand that. It isn't much different with 2 robots. Give it an honest try and post back if you have issues with what you specifically tried and what problems you encountered.
-
Dha gave you the answer. The programs will look the same as any normal program except you must set the motion group mask for either G1 or G2 in the program detail.
You will basically have 2 sets of UOP signals to start the programs.
-
Your explanation makes sense. I have not tried to export an image from iRVision. I think you would be better off using a 3rd party vision system in that case. Fanuc's iRVision is intended to do all the processing within the robot controller, but since you want to send the image out to be processed, it doesn't really make sense to use integrated vision.
-
I looked at the other post and my insight is: I don't understand why you would want to do that. If it's just an academic exercise that's fine, but I would never use a cloud app to control the motion of a robot. I don't think it would be practical in the real world. The cloud app could be used for data collection, but sounds like a bad idea for motion control.
-
Use the following call to set a register value, in this example R[1]; just replace the IP address with your robot's:
http://127.0.0.1/karel/ComSet?s…alFlag=-1&sFc=2
Then have a BG Logic program monitor that register and if it changes value you can manipulate Flags. The flags can either be tied to UI signals to start the robot program or to a macro.
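As a sketch, the full request URL can be built like this in Python. Note the parameter names (sValue, sIndx, sRealFlag, sFc) are written from memory of the ComSet web interface, not taken from your controller, so verify them against the Karel reference manual before relying on them:

```python
def comset_numreg_url(robot_ip, reg_index, value):
    """URL that asks the robot web server to set R[reg_index] = value.

    Parameter names are assumed from memory of /karel/ComSet:
    sValue = new value, sIndx = register index,
    sRealFlag = -1 (integer), sFc = 2 (numeric register).
    """
    return (f"http://{robot_ip}/karel/ComSet"
            f"?sValue={value}&sIndx={reg_index}&sRealFlag=-1&sFc=2")
```

You would then issue an HTTP GET on that URL (e.g. with urllib) from your external application.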
-
Try deleting the monitor instruction, then add it back.
Another solution would be to use BG Logic. It would be very simple to do so.
-
Is the existing battery low alarm signal not good enough for your purpose?