Posts by sweck

    Does anyone know which file(s) contain the collaborative settings? Are they included in the two DCS files listed above [DCSPOS.SV (Position/Speed Check), DCSIOC.SV (Safe I/O Connect)]? I want to make sure I get those files when I transfer from ROBOGUIDE.


    This was a great source of frustration for me on a previous project. Newer versions of HandlingTool (I'm not sure which release) allow using the Histogram tool without a parent locator, but as far as I know (FANUC support concurred on this), before that update the Histogram tool had to have a parent 2-D vision process.

    Can you use a GPM locator on your outer box features as the parent for the histogram? That seems like the best way anyway to make sure your histogram search area is properly located.

    Another alternative, if you can't see the outer box features, is to use a Single-View Inspection vision process instead of a 2-D vision process. The limitation I ran into there is that it can't export measured values -- only a pass/fail result.

    I had this issue when trying to pick very small parts (.120" x .040"), so I was having trouble hitting the part at all. I would add to the previous post: make sure you define a very accurate USER FRAME near the pick point. Even doing that, plus robot-generated calibration, fine-tuning the TCP, and all the other suggestions in the manuals, didn't completely fix it for me.

    I had to use the ADJ_OFS command in the Vision Support Tools option (not too expensive...). You can read about it in the iRVision manual. Using it involves a trial-and-error setup process, but I did get it to work in the end.

    I highly recommend iRVision 2D for robot guidance. Pattern matching for guidance is about the only thing it does well, but it does it very well and the tool is very powerful. I have used it to pick parts down to .020" x .020" out of a 2" field of view, which is well below the recommended resolution for that.
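    To put rough numbers on that -- this is just a back-of-the-envelope sketch, and it assumes a typical 640 x 480 camera, not any particular setup:

    ```
    PROGRAM px_check
    %NOLOCKGROUP
    VAR
      px_size : REAL  -- field-of-view inches per camera pixel
      part_px : REAL  -- pixels spanned across the part
    BEGIN
      -- assumes a 640-pixel-wide sensor imaging a 2.0" field of view
      px_size = 2.0 / 640         -- about 0.0031 in/pixel
      part_px = 0.020 / px_size   -- about 6.4 pixels across a .020" part
      WRITE('pixels across part: ', part_px, CR)
    END px_check
    ```

    At roughly 0.003"/pixel, a .020" part spans only about six pixels, which is why that is well below what the manuals recommend.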

    I agree with the previous poster that lighting and good contrast are the key to machine vision, but I have seen iRVision perform well even with non-ideal contrast. I would also recommend a band-pass filter on the lens every time: it cuts out most of the ambient light, so you don't have to enclose the whole robot cell.

    It should have no problem with your pulley project unless your angle tolerance is tighter than the camera resolution can reliably measure. In that case you might need a 5 MP or higher camera, which unfortunately is not an option with iRVision.

    I was thinking more of KAREL "script" functionality that you could call as a subprogram.

    For example, a program that would stop BG program 1 (not sure I'd ever want to do this, but it's an example...). That "script" would be "MENU" "6" "0" "3" "F3". The code (using TPIN signal assignments) would look something like this:
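    Something along these lines -- a sketch only, since the key-code values below are placeholders (the real TPIN index for each key would have to come from the KAREL Reference Manual), and it assumes the controller would even accept writes to TPIN:

    ```
    PROGRAM bg_stop
    %NOLOCKGROUP
    CONST
      kc_menu = 143  -- placeholder key codes: the real TPIN
      kc_6    = 54   -- index values for each key are listed
      kc_0    = 48   -- in the KAREL Reference Manual
      kc_3    = 51
      kc_f3   = 131
    ROUTINE press_key(kc : INTEGER)
    BEGIN
      TPIN[kc] = TRUE   -- "press" the key...
      DELAY 100
      TPIN[kc] = FALSE  -- ...then release it
    END press_key
    BEGIN
      press_key(kc_menu)  -- MENU
      press_key(kc_6)     -- 6
      press_key(kc_0)     -- 0
      press_key(kc_3)     -- 3
      press_key(kc_f3)    -- F3
    END bg_stop
    ```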


    But TPOUT only has the ability to turn on the three LEDs on the TP. Is there another approach to doing this?
    I'm guessing there are all kinds of reasons why FANUC would not supply this functionality, but thought I'd ask...

    Is there a way to use KAREL to run scripts or "macros" of keystrokes to access any of the screens/commands/settings available on the TP? I'm thinking of something similar to a macro in Excel. KAREL commands give access to a lot of screens and commands, but not all of them. This seems like it would be an easy and comprehensive method.

    I see the TPIN command in the manual for reading keystrokes, but I didn't see a way to write or simulate TP keystrokes.


    I think that even a higher-end inspection system would have trouble with this. (I have found that iRVision is very good at one thing -- locating with pattern match -- but not very good for general inspection.)

    Can you mechanically force all the washers down or up so none are tilted? Maybe better lighting would provide the contrast to reliably differentiate the washers -- maybe a dome light. Or a backlight to better see the size.
    Just some ideas....good luck.