Cognex Vision integration with Fanuc

  • Initial-Y
  • July 19, 2016 at 3:55 AM
  • Thread is Resolved
  • Initial-Y
    • July 19, 2016 at 3:55 AM
    • #1

    Hi Guys,

    I need some help on how I could program a Fanuc robot to accept a vision offset value when the vision system is a third-party product like Cognex instead of iRVision.

    Thanks in advance!

  • DaveP
    • July 19, 2016 at 1:49 PM
    • #2

    Check out this thread: https://www.robot-forum.com/robotforum/fan…77106/#msg77106

    I was able to get the robot and the Cognex system communicating using those notes and the Cognex documentation. But we had to use the CIO-Micro I/O module for that project, because the Cognex tools are limited when you use EtherNet/IP to communicate with the Fanuc robot. It will send the X, Y, and angle of a position very easily; I needed more options for that particular project.
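
    A minimal TP sketch of one way the robot side of that EtherNet/IP exchange could look, assuming the Cognex X, Y, and angle results are mapped to group inputs GI[1]-GI[3] and scaled by 100; all register, PR, and GI numbers here are illustrative, not taken from the posts:

    1:  !Read the Cognex result from EtherNet/IP ;
    2:  !(GI mapping and 0.01 scaling are assumptions) ;
    3:  R[51:VisX]=GI[1]/100 ;
    4:  R[52:VisY]=GI[2]/100 ;
    5:  R[53:VisAng]=GI[3]/100 ;
    6:  !Build an X/Y/R offset in a position register ;
    7:  PR[50]=LPOS-LPOS ;
    8:  PR[50,1]=R[51:VisX] ;
    9:  PR[50,2]=R[52:VisY] ;
   10:  PR[50,6]=R[53:VisAng] ;
   11:  !Apply it as a position offset on the pick ;
   12:L P[1:Pick] 500mm/sec FINE Offset,PR[50] ;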

  • locologic
    • July 20, 2016 at 4:35 PM
    • #3

    Has anyone used a third-party camera (Cognex, DVT, etc.) or sent a position from a PC to a Fanuc robot for visual line tracking? Is it possible to synchronize the third-party camera's offset with the tracking sensor in the Fanuc, so the robot knows when the upstream picture was taken and therefore where the product is now? I was thinking of using a photoeye to both trigger the camera and start the conveyor tracking, but since there could be many parts on the conveyor at a time, this doesn't seem like the best option.
    Thanks.

  • dmbj
    • July 20, 2016 at 4:52 PM
    • #4
    Quote from locologic


    Has anyone used a third-party camera (Cognex, DVT, etc.) or sent a position from a PC to a Fanuc robot for visual line tracking? Is it possible to synchronize the third-party camera's offset with the tracking sensor in the Fanuc, so the robot knows when the upstream picture was taken and therefore where the product is now? I was thinking of using a photoeye to both trigger the camera and start the conveyor tracking, but since there could be many parts on the conveyor at a time, this doesn't seem like the best option.
    Thanks.

    There is no reason it should not work:
    run a SETTRIG command on the line tracking, then
    run your vision; if you find something, go get it, and if not, start over.

    I have found you should run your SETTRIG before RUN_FIND (in iRVision) for better results. If your vision process takes more than 500 ms, you will run into problems.
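
    In TP terms, the order described here would look roughly like the sketch below. The line-tracking schedule, vision process name, and pick routine are illustrative, and the exact instructions depend on which Line Tracking / iRVision options are loaded:

    1:  LBL[1] ;
    2:  !Latch the tracking trigger first ;
    3:  SETTRIG LNSCH[1] TRIG[1] ;
    4:  !Then run the vision process ;
    5:  VISION RUN_FIND 'TRACK_VP' ;
    6:  !No part found: jump back and try again ;
    7:  VISION GET_OFFSET 'TRACK_VP' VR[1] JMP LBL[1] ;
    8:  !Part found: go get it ;
    9:  CALL PICK_PART ;
   10:  JMP LBL[1] ;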

  • locologic
    • July 21, 2016 at 4:11 PM
    • #5

    Where would you apply the offset found by the third-party camera (a user frame, an offset on the pick position, etc.), and how would you know which part in the conveyor-tracking queue goes with which offset by the time the part reaches the robot, if there are multiple parts between the camera and the robot?
    I hope that makes sense.

  • mikeandersontx
    • August 8, 2016 at 10:36 PM
    • #6
    Quote from locologic


    Where would you apply the offset found by the third-party camera (a user frame, an offset on the pick position, etc.), and how would you know which part in the conveyor-tracking queue goes with which offset by the time the part reaches the robot, if there are multiple parts between the camera and the robot?
    I hope that makes sense.

    When I did this using a Cognex camera, I created a FIFO stack on the robot. I had a background task that would push data onto the stack and move the push pointer as each new vision snap occurred. I also had a pop pointer that always let me pop the proper position data; it was called whenever I needed new vision data. This was a simple 10-position register setup for each of the X, Y, Z (from an external sensor), and R values. It may have been cleaner to use PRs, but I was in a hurry and didn't know better, so I used individual registers for the data.
    I applied the offset data to the user frame for the X and Y positions, and I applied the Rz data to the tool frame in R.
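
    A minimal sketch of that frame-based way of applying the offsets, assuming the popped vision values already sit in plain registers; the frame and register numbers are illustrative, not taken from the posts:

    1:  !Shift the pick user frame by the vision X/Y ;
    2:  PR[60]=UFRAME[1] ;
    3:  PR[60,1]=PR[60,1]+R[54:PopX] ;
    4:  PR[60,2]=PR[60,2]+R[55:PopY] ;
    5:  UFRAME[1]=PR[60] ;
    6:  !Put the part rotation (Rz) into the tool frame R component ;
    7:  PR[61]=UTOOL[1] ;
    8:  PR[61,6]=R[57:PopR] ;
    9:  UTOOL[1]=PR[61] ;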

  • lunknowl
    • October 12, 2016 at 4:25 PM
    • #7
    Quote from mikeandersontx

    When I did this using a Cognex camera, I created a FIFO stack on the robot. I had a background task that would push data onto the stack and move the push pointer as each new vision snap occurred. I also had a pop pointer that always let me pop the proper position data; it was called whenever I needed new vision data. This was a simple 10-position register setup for each of the X, Y, Z (from an external sensor), and R values. It may have been cleaner to use PRs, but I was in a hurry and didn't know better, so I used individual registers for the data.
    I applied the offset data to the user frame for the X and Y positions, and I applied the Rz data to the tool frame in R.


    Can I get the background task to push data onto the stack? Or any information on how to do it? Thanks.

  • mikeandersontx
    • October 19, 2016 at 11:31 PM
    • #8

    Quote from lunknowl

    Can I get the background task to push data onto the stack? Or any information on how to do it? Thanks.

    This is actually a program started with RUN (a separate task), not BG Logic. R[321]-R[325] are the height values I was reading in from the sensor (an analog light curtain).

    1: R[320:HeightReg]=0 ;
    2: R[321:CurHeight0]=R[281:SensorMin] ;
    3: R[321:CurHeight0]=R[281:SensorMin] ;
    4: R[322:CurHeight1]=0 ;
    5: R[323:CurHeight2]=0 ;
    6: R[324:CurHeight3]=0 ;
    7: R[325:CurHeight4]=0 ;
    8: R[319:HeightIndex]=0 ;
    9: ;
    10: LBL[1] ;
    11: IF R[299:STOP_TASKS]=1,JMP LBL[999] ;
    12: IF (F[18:CamSnapped]=ON AND F[15:VisPop]=OFF),JMP LBL[200] ;
    13: IF (F[7:LC Trigger]=ON),JMP LBL[100] ;
    14: ;
    15: JMP LBL[1] ;
    16: ;
    17: LBL[100:init read] ;
    18: R[283:LargestReading]=R[281:SensorMin] ;
    19: LBL[110:read height] ;
    20: IF (F[7:LC Trigger]=OFF),JMP LBL[190] ;
    21: IF (R[282:SensorReading]>R[283:LargestReading]),R[283:LargestReading]=(R[282:SensorReading]) ;
    22: JMP LBL[110] ;
    23: ;
    24: LBL[190:end read] ;
    25: R[320:HeightReg]=321+R[319:HeightIndex] ;
    26: R[R[320]]=R[283:LargestReading] ;
    27: IF (R[319:HeightIndex]<4),R[319:HeightIndex]=(R[319:HeightIndex]+1) ;
    28: JMP LBL[1] ;
    29: ;
    30: LBL[200:Pop Off Que] ;
    31: R[321:CurHeight0]=R[322:CurHeight1] ;
    32: R[322:CurHeight1]=R[323:CurHeight2] ;
    33: R[323:CurHeight2]=R[324:CurHeight3] ;
    34: R[324:CurHeight3]=R[325:CurHeight4] ;
    35: R[325:CurHeight4]=0 ;
    36: ;
    37: R[319:HeightIndex]=R[319:HeightIndex]-1 ;
    38: IF (R[319:HeightIndex]<0),R[319:HeightIndex]=(0) ;
    39: WAIT R[287] ;
    40: F[15:VisPop]=(ON) ;
    41: JMP LBL[1] ;
    42: LBL[999] ;

    Hopefully this will help you a bit. May not be the best way, but it was a quick way to make it work.

  • mikeandersontx
    • October 19, 2016 at 11:39 PM
    • #9

    If you want to swap the order of removal of data, set up another pointer (I used R[319] for the first one). You can write with one and read with the other (R[318] for instance).
    Increment push and pop counters independently and you can retrieve data FIFO or FILO. If you need more help, let me know.
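
    A minimal sketch of that two-pointer idea, reusing the indirect register addressing from the listing above; the pointer names and R[318]/R[284] are illustrative:

    1:  !Push: store the newest value at the push pointer ;
    2:  R[320:HeightReg]=321+R[319:PushIdx] ;
    3:  R[R[320]]=R[283:LargestReading] ;
    4:  R[319:PushIdx]=R[319:PushIdx]+1 ;
    5:  ;
    6:  !Pop (FIFO): read the oldest value at the pop pointer ;
    7:  R[320:HeightReg]=321+R[318:PopIdx] ;
    8:  R[284:CurHeight]=R[R[320]] ;
    9:  R[318:PopIdx]=R[318:PopIdx]+1 ;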

  • lunknowl
    • October 20, 2016 at 5:01 PM
    • #10

    How does your program work with line tracking? Don't you need to keep track of the encoder count too? And how do you trigger the line tracking?

  • dangalg
    • August 9, 2018 at 6:36 AM
    • #11

    http://www.cognex.com/support/downlo…tComms_Fanuc_v1[1].doc
