Robotforum - Support and discussion community for industrial robots and cobots › Members › rumblefish

Posts by rumblefish

  • RC9 Safety Parameters

    • rumblefish
    • May 16, 2024 at 10:05 PM

    So I've got a new VL with an RC9 controller. I finally muddled through the initial setting file and am trying to transfer safety parameters to the robot. The problem is that the transfer errors out every time. I transfer through WINCAPS and then try to transfer the safety parameters using the "Safety Parameter Tool", to no avail. Both transfers are to the same controller IP address. I found a document in Denso's KB stating that the safety motion board uses a different IP, but I can't find any other documentation with further detail. Am I going about this the correct way, or am I way off base?

    *** UPDATE ***

    Apparently the safety board is the unmarked RJ45 bulkhead connector. Awesome.

  • Split register value into two 16bit values for group outputs

    • rumblefish
    • March 16, 2020 at 4:20 PM

    If you have the math option in the robot, it helps. In the PLC we take a REAL number and create two INTs: one for the left side of the decimal, the other for the right side. Send those values via GI to the robot; in the robot, the number gets reassembled in BG logic: new number = left side + (right side / 1000). For negatives, the PLC sets a DI and the math just multiplies by -1.
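    The round trip described above can be sketched in ordinary code (a minimal sketch; `split_real` and `rebuild` are hypothetical names, and the 3-decimal scaling and sign flag mirror the DI scheme described):

    ```python
    def split_real(value):
        """Split a REAL into two 16-bit-friendly INTs (integer part and
        milli-fraction) plus a sign flag, mirroring the DI the PLC sets.
        Both parts must stay under 32767 to fit a 16-bit group output."""
        negative = value < 0
        v = abs(value)
        left = int(v)                          # left side of the decimal
        right = int(round((v - left) * 1000))  # right side, 3 decimal places
        return left, right, negative

    def rebuild(left, right, negative):
        """BG-logic style reassembly: new number = left + (right / 1000),
        multiplied by -1 when the sign DI is set."""
        v = left + right / 1000.0
        return -v if negative else v

    value = rebuild(*split_real(-123.456))   # -123.456 (within float rounding)
    ```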

  • Grabbing Joint Position at High Speed

    • rumblefish
    • February 21, 2020 at 5:18 PM

    Have you tried using BG logic to capture the sysvar for the joint position? I believe BG logic scans at 8 ms or so. You'll probably still have to compensate for joint speed and reaction time.
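    A quick back-of-envelope for that compensation (hypothetical numbers; the 8 ms figure is the scan rate guessed at above):

    ```python
    # Estimate how far a joint moves between BG-logic samples, to correct a
    # captured joint position for scan latency.
    scan_s = 0.008            # assumed BG-logic scan period (~8 ms)
    joint_speed_dps = 90.0    # hypothetical joint speed, deg/s
    lag_deg = joint_speed_dps * scan_s   # worst-case lag per sample, ~0.72 deg

    captured_deg = 45.00                 # hypothetical captured joint angle
    corrected_deg = captured_deg + lag_deg
    ```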

  • Inertia Values For Payload

    • rumblefish
    • February 21, 2020 at 4:04 PM

    Is the calibration status "Done" in payload IDENT? I've found that when using IDENT, if the payload isn't calibrated without an EOAT for a baseline, it won't take new data. Run it without the tool and enable calibration mode.

  • ProFace HMI panels with R30iB

    • rumblefish
    • January 7, 2020 at 5:11 PM

    For DI/DO there is no setup; it's addressing on the ProFace side:

    DI1 > %Q1

    DO1 > %I1

    Registers will have to have SNPX assignments made. This is where the Fanuc HMI manual comes into play.

    SNPX uses R (registers), but they are not related to the numeric "R" registers on the data screen.

    In an SNPX assignment you assign:

    Address - the start of the SNPX register range to use

    Size - how many SNPX registers are required

    Var_Name - the robot data to read and its format, i.e. PR, R, sysvariable, etc.

    Multiply - basically the decimal precision/place

    I highly advise reading the Fanuc HMI User manual. Here's a quick setup for reading registers.

    Keep in mind, ProFace will use the address 5000 (%R5000). It can be changed to 1.

    ASG 8 refers to the sysvar $SNPX_ASG[8]. This is where you set your SNPX assignments; you'll see there are quite a few. I just happened to use 8 for this example. It can be changed to ASG 1.

    ASG 8:
      Address  - 5000
      Size     - 50
      Var Name - 'R[150]' (data registers 150-175)
      Multiply - 1.0 (32-bit signed INT)

    This should get you started.
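    Putting those fields into sysvar form (a sketch based on the values above; the `$SNPX_ASG` field names follow the FANUC HMI manual's structure, so verify them against your controller before relying on this):

    ```
    $SNPX_ASG[8].$ADDRESS  = 5000
    $SNPX_ASG[8].$SIZE     = 50
    $SNPX_ASG[8].$VAR_NAME = 'R[150]'
    $SNPX_ASG[8].$MULTIPLY = 1.0
    ```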

  • ProFace HMI panels with R30iB

    • rumblefish
    • January 6, 2020 at 4:15 PM

    Comm setup or the SNPX assignment structure? I've previously posted a pic of a comm setup on the ProFace side.

  • Roboguide How to making Machines move?

    • rumblefish
    • December 10, 2019 at 10:30 PM

    Make the table a "Machine" and add a servo. "Machines" can have simulated motion. You can then command motion and even use DO to control it. You might make it a conveyor type if you need it to spin continuously.

  • Fanuc Mastering, Vision, 7th Axis Questions

    • rumblefish
    • December 10, 2019 at 9:45 PM

    Yes, AGFS is Auto Grid Frame Set. It won't move the 7th axis; however, it is a much more accurate method of setting frames and camera calibration. The big issue is maintaining the working distance with the camera (cam-to-part distance). The type of vision process you use will also be another factor. I went through 3-4 VPs until I found one that worked for my app. I didn't need 6 degrees of freedom in the offset. Hopefully this helps.

  • Fanuc Mastering, Vision, 7th Axis Questions

    • rumblefish
    • December 10, 2019 at 7:01 PM

    You can master the 7th axis wherever you want; just be sure to record/mark that position on the rail. Verify the gearing for the 7th axis. If you have it set to extended integral, you should be able to jog in WORLD/TOOL and move the carriage (axis 7) with the TCP remaining constant.

    Regarding iRVision, if it is a robot mounted camera be sure to use AGFS to set cam cal and uframes.

  • ACTIVE UFRAME

    • rumblefish
    • December 4, 2019 at 4:33 PM

    $MNUFRAMENUM[G] will show the active uframe. G being the group number.

  • Interconect Nsi with DO

    • rumblefish
    • December 4, 2019 at 12:33 AM

    The IO will config as DO like this:

    Rack 36

    Slot 0

    Start 1 (depending on what you're starting with)

  • Position Transform

    • rumblefish
    • December 3, 2019 at 2:31 PM

    So just to update: it turns out the math was/is correct. The X-axis error was due to inconsistencies in the locating. I was using GEDIT to find circles; however, the vision process cannot find the true center automatically. The origin was drifting and not repeating from snap to snap. This was confirmed with FANUC as well. For whatever reason, a GEDIT process behaves differently than a "taught" image with actual contrast lines.

  • fanuc and IO link

    • rumblefish
    • December 3, 2019 at 1:17 AM

    Gotcha,

    Usually the common name is a stumbling point for most.

  • fanuc and IO link

    • rumblefish
    • November 26, 2019 at 8:58 PM

    What you are referring to are two different "IO Link" busses. The popular sensor bus "IO Link" is a device level protocol. FANUC's IO Link is a proprietary serial bus. They are two separate protocols.

  • HMI Device option (R553) for fanuc robot

    • rumblefish
    • November 25, 2019 at 3:58 PM

    This is the SNPX HMI Device option; it's a paid option. You'll need to buy it and get a PAC code from FANUC. It's simple to install and only takes a few minutes.

  • Position Transform

    • rumblefish
    • November 22, 2019 at 8:37 PM
    Quote from HawkME

    You say the vision offset is in world, then later you state the part is located in UF[2]. Which is it?

    Sorry, it's a work in progress. It was world, but I changed it to UF2. World was giving me fits; the robot is upside down on a rail and rotated 90° from the UFs.

    I had offset only due to being lazy; it's a 2D found-position process, so it still spits out the found pos as an offset.

    Quote from Robot Programmer

    $MNUserFrame

    I think if you added the current user frame, then subtracted the desired user frame. I haven't ever done this, but I think that would work. Let me know if it does.

    That's what I'm doing: taking the difference between the two frames and then multiplying against the found pos.

  • Position Transform

    • rumblefish
    • November 22, 2019 at 2:50 PM

    I'm trying to figure out how to convert a Cartesian position from one frame to another.

    I have an application using a robot-mounted camera to locate a product on a tray. The tray is a calculated user frame, so the uframe can change based on the tray. The vision process is set up to use WORLD for the offset frame. The issue is that the robot is told where to go on the tray by a parent system, and it needs to report the found position back to the parent. My problem is converting the found pos in WORLD to the tray uframe. I was trying to take the inverse of the uframe and matrix it against the WORLD found pos, but I'm not having any luck.

    Does anyone know of a solution for this? I need a nudge in the right direction, but I'm not opposed to just being told how to do it either. Any help is appreciated!

    Here's the current logic.

    The part is located in UF2 but needs to be picked in UF3.

    PR[228]=VR[1].OFFSET (Found Pos Vis Proc)

    PR[230]=UF[3] (UF3 Changes based on cart geometry)

    PR[230],13=0 (Extended Rail 7th Axis)

    CALL INVERSE (230,230)

    PR[233]=UF[2] (Static, never changes)

    PR[233],13=0

    CALL INVERSE (233,233)

    PR[234]=PR[230]-PR[233]

    CALL MATRIX (234,228,60,160)

    I'm off in the X axis and not sure why.


    P.S. The reason for inspecting this in WORLD is an error in the original setup. I've since changed the vision process offset frame to a static frame that doesn't change.
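    The conversion being attempted here is p_uf = inv(UF) · p_world: apply the inverse of the user-frame transform to the world-frame point. A pure-Python sketch under simplifying assumptions (rotation about Z only and hypothetical frame values; a real FANUC frame carries full W, P, R rotations):

    ```python
    import math

    def frame_to_matrix(x, y, z, yaw_deg):
        # Simplified user frame: translation plus rotation about Z (W = P = 0).
        c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
        return [[c, -s, 0, x],
                [s,  c, 0, y],
                [0,  0, 1, z],
                [0,  0, 0, 1]]

    def invert(T):
        # Inverse of a rigid transform: transpose the rotation, negate-rotate
        # the translation.
        Rt = [[T[c][r] for c in range(3)] for r in range(3)]
        t = [T[r][3] for r in range(3)]
        nt = [-sum(Rt[r][k] * t[k] for k in range(3)) for r in range(3)]
        return [Rt[0] + [nt[0]], Rt[1] + [nt[1]], Rt[2] + [nt[2]], [0, 0, 0, 1]]

    def apply(T, p):
        # Transform a 3D point by a 4x4 homogeneous matrix.
        return [sum(T[r][c] * p[c] for c in range(3)) + T[r][3] for r in range(3)]

    uf = frame_to_matrix(100.0, 50.0, 0.0, 90.0)  # hypothetical tray frame
    world_pos = [120.0, 80.0, 10.0]               # found position in WORLD
    uf_pos = apply(invert(uf), world_pos)         # same point in the tray frame
    ```

    The key point is that the world point is multiplied through the inverted frame matrix, not obtained by subtracting frame components.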

  • Ethernet IP Topology Proposal

    • rumblefish
    • November 15, 2019 at 9:24 PM

    I would just get a smart/managed switch and plug everything into that; that's the topology for almost every EIP network I've done. Balluff BNI blocks allow you to physically daisy-chain the network connections.

    EIP connects like a normal Cat 5 Ethernet network and can coexist with other protocols on the network.

    Sounds like you're on the right track now.

  • Ethernet IP Topology Proposal

    • rumblefish
    • November 15, 2019 at 2:30 PM

    What is the "Product Coupler"? If it is a network switch, there shouldn't be any issue with EIP. If you're trying to daisy chain devices, it then becomes dependent on device passthrough for the network connection.

    Did the supplier give a reason as to why it won't work?

    Balluff is a good product, I have had success with them in the past. Allen Bradley 1734 I/O works well with the robot as well.

  • Palletizing

    • rumblefish
    • November 12, 2019 at 5:12 PM

    I don't remember specifically if it is due to crossing an axis or a lack of turn data; maybe someone else can better explain it. It still reaches the Cartesian point; however, (for example) J4 could flip. The TCP is still at the position, just with a different config.

    When you say "not going to the exact position", do you mean the calculated position and the displayed position are different? Or is it that the robot didn't line up with a part/fixture?

    If it is the latter, you need to verify the stack-up tolerance on the trays/fixtures. If the destination has 20 spots and they were built to +/-50 µm, you could have up to a millimeter of error at the end. You can't expect a robot to pick to +/-0.5 mm if the tray is +/-2 mm.
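    The arithmetic behind that last point, using the example figures from the post:

    ```python
    # Worst-case tolerance stack-up across a destination tray.
    spots = 20                 # positions in the tray
    tol_per_spot_mm = 0.050    # +/- 50 um build tolerance per spot
    worst_case_mm = spots * tol_per_spot_mm   # 1.0 mm at the far end
    ```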
