Can you give more detail on the flag setup (MENU > I/O > Flag)? How do I map it to a UI signal?
Posts by lunknowl
-
Hi, is it possible to link register data [R1] to a user input [UI:1] to simulate the user input, i.e. UI[1] = R1?
-
1) The manual's sample uses an input to trigger the encoder value. I want to know whether a variable can trigger it without an input.
2) I also want to know the sequence for setting up line tracking. The manual shows all the options but doesn't say which step I need to do first.
I just want simple line tracking with a FIFO (I already have a background task for that).
-
Can you be more specific about how to use a register for the encoder trigger? Thanks.
-
Hi, is there anyone using line tracking with multiple buffers? I don't want to use a camera because I don't need those fancy options. I have a buffer program that gets X, Y, angle, and encoder count, but I don't know how to set up line tracking using that buffer.
-
Does anyone know how to use Telnet to change register data (e.g. R[1]=1)? Thanks.
-
1) I connected an M96 robot to the S4C controller and always get error 20252 (motor temperature high), but the motor is still cold. Does anyone know how to jumper the PTC sensor out?
2) Can I use the IRB 340 motor with an Ultra 3000?
Note: I already called ABB support but they haven't replied yet. I just want to know whether this is possible, or whether someone has already tried it, because this is what I have in stock. Thanks for reading.
-
It's certainly possible to reinvent the wheel, but it would be much easier to just get the iRVision 2D Visual Tracking option. Is your goal simply to get the system to work, or to gain a very in-depth, academic understanding of how visual tracking works?
If you enjoy the prospect of the academic exercise:
1. Write a background "sensor task" that triggers your 2d single view vision process every so often, either on a time delay, or every X distance of conveyor travel (better).
2. Associate/save the current encoder count with this set of offsets (you'll need it later to figure out how far the parts have traveled along the conveyor).
3. Translate your vision offsets from the camera's calibration frame into your tracking frame.
4. Put these vision offsets into some sort of queue, e.g. [{ encoder_count: 10000, vision_offset: { x: 0, y: 0, r: 0 }}, { encoder_count: 10000, vision_offset: { x: 10, y: 5, r: 45 }}]
5. Write a routine that receives offsets from the queue when they are within the robot's tracking boundaries, populating a vision register with the offset data
6. Make sure to write a couple of maintenance routines that clean up your queue (get rid of parts that pass beyond all boundaries, parts that were seen by the camera twice, etc.).
-
How can we set up a queue to use for vision?
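A minimal sketch of the encoder-stamped queue from the steps above, written in Python for illustration only. On a real FANUC system this logic would live in a Karel or TP background task using registers; the encoder scale, boundary distances, and all names here are assumptions, not anything from the controller.

```python
# Illustrative sketch of the encoder-stamped vision queue (steps 2, 4, 5, 6).
# COUNTS_PER_MM and the boundary distances are hypothetical values.
from collections import deque

COUNTS_PER_MM = 100   # assumed encoder scale (counts per mm of conveyor travel)
UPSTREAM_MM = 300     # tracking window start, measured from the camera
DOWNSTREAM_MM = 800   # tracking window end; a part past this is lost

queue = deque()

def on_vision_snap(encoder_count, x, y, r):
    """Step 2 and 4: stamp the offsets with the encoder count and enqueue."""
    queue.append({"encoder_count": encoder_count,
                  "vision_offset": {"x": x, "y": y, "r": r}})

def next_part(current_count):
    """Steps 5-6: purge lost parts, then hand out a part inside the window."""
    while queue:
        traveled_mm = (current_count - queue[0]["encoder_count"]) / COUNTS_PER_MM
        if traveled_mm > DOWNSTREAM_MM:
            queue.popleft()          # step 6: part passed all boundaries, drop it
        elif traveled_mm >= UPSTREAM_MM:
            return queue.popleft()   # step 5: inside the tracking boundaries
        else:
            return None              # not in the window yet, wait
    return None

on_vision_snap(10000, 0, 0, 0)       # first snap, one part found
on_vision_snap(10500, 10, 5, 45)     # later snap, another part found
part = next_part(10000 + 400 * COUNTS_PER_MM)   # conveyor has moved 400 mm
print(part["vision_offset"])
```

The key design point is that each entry carries its own encoder count, so "how far has this part traveled" is always (current count − stamped count) / scale, regardless of when the part was seen.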
-
Can we simulate the data by using registers? So every time the sensor triggers, we send register data to the buffer/queue without using vision. Thanks.
-
How does your program work with line tracking? Don't you need to keep track of the encoder count too? And how do you trigger the line tracking?
-
Is there a way to set up a vision sensor that will send out data 0,0,0 to simulate the line tracking?
-
Hi, I use an input/sensor to trigger the line tracking conveyor. Is there a way to do a buffer/FIFO/queue? Right now the robot only picks one object at a time.
-
Hi, I just want to know if anybody knows how to transfer a FANUC option. I have two robots: robot 1 with option 1 and robot 2 with option 2. I just want to transfer option 2 to robot 1, so that robot 1 has both options and robot 2 has neither. On ABB robots we can manually transfer the license at no cost, but on FANUC robots transferring a license costs more than buying the option, which doesn't make sense.
-
When I did this using a Cognex camera, I created a FIFO stack on the robot. I had a background task that would push data onto the stack and move the push pointer as each new vision snap occurred. I also had a pop pointer that allowed me to always pop the proper position data; this was called whenever I needed new vision data. It was a simple 10-position register setup for each of the X, Y, Z (from an external sensor), and R values. It may have been cleaner to use PRs, but I was in a hurry and didn't know better, so I used individual registers for the data.
I applied the offset data to the user frame for the X and Y positions, and applied the Rz data to the tool frame for R.
-
Can I get the background task to push data onto the stack? Or any information on how to do it? Thanks.
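The register-based FIFO described above can be sketched as a circular buffer with separate push and pop pointers. This is illustrative Python only: on the robot the four banks would be numeric registers R[...] updated by a background task, and the class, names, and full/empty handling here are all assumptions.

```python
# Minimal sketch of a 10-position FIFO with push/pop pointers, one
# "register bank" per value (X, Y, Z, R), as described in the post above.
SIZE = 10

class RegisterFifo:
    def __init__(self):
        self.x = [0.0] * SIZE
        self.y = [0.0] * SIZE
        self.z = [0.0] * SIZE
        self.r = [0.0] * SIZE
        self.push_ptr = 0   # advanced by the background task on each snap
        self.pop_ptr = 0    # advanced when the robot consumes vision data
        self.count = 0      # how many unconsumed entries are in the FIFO

    def push(self, x, y, z, r):
        """Called by the background task on each new vision snap."""
        if self.count == SIZE:
            raise OverflowError("FIFO full - the robot is falling behind")
        i = self.push_ptr
        self.x[i], self.y[i], self.z[i], self.r[i] = x, y, z, r
        self.push_ptr = (i + 1) % SIZE   # wrap around after slot 10
        self.count += 1

    def pop(self):
        """Called when the robot needs the next set of offsets (oldest first)."""
        if self.count == 0:
            return None                  # nothing seen yet
        i = self.pop_ptr
        self.pop_ptr = (i + 1) % SIZE
        self.count -= 1
        return self.x[i], self.y[i], self.z[i], self.r[i]

fifo = RegisterFifo()
fifo.push(12.5, -3.0, 50.0, 15.0)   # vision snap 1
fifo.push(8.0, 4.2, 50.0, -10.0)    # vision snap 2
print(fifo.pop())                    # oldest part comes out first
```

The two independent pointers are what let the background task keep pushing while the motion program pops at its own pace; the count (or a gap between the pointers) tells each side whether the buffer is full or empty.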