iRVision measurements off

  • Hi,


    I've done an iRVision robot-generated calibration at a height of 55 mm (the workpiece height), with a printed target on the EOAT. The camera is a 1.3 MP camera mounted at a height of around 500 mm. We're not getting accurate measurements: they range from 1-2 mm off in X and 2-4 mm off in Y.


    I tried measuring at the calibration level, and the measurement was still off by around 1 mm. That measurement is at around 450 mm from the camera.


    Just wondering why. I was thinking it might be the camera's focus.


    Thanks

  • What are your calibration results? Your mean and max error values should help you determine whether the camera calibration is off. Would a larger target help with calibration?


    Regarding the camera's focus, I believe the standard Kowa cameras' optimal working distance is 400 mm. I'm sure you can focus the camera at 500 mm, but it's something to consider.

  • I calibrated at a level closer to the camera, about 400 mm, and the measurement was still off by 1-2 mm.


    The minimum error value is 0.885 and the max is 2. I've calibrated multiple times, but the error values are usually in that range.

  • That is a very high error, especially for a 400 mm working distance. I would consider addressing some of your physical limitations: a larger calibration grid, adjusting exposure, etc. Also ensure your setup values are correct: focal distance (check what the system thinks against a tape measure), grid spacing value, etc.


    As a last resort, delete the high-error calibration points, as they will affect the overall average error.
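
    Not FANUC code, just a rough Python sketch (with made-up per-point errors) of how much a few outlier points can pull up the average error:

    ```python
    # Illustration only: these per-point calibration errors (in pixels) are invented.
    errors_px = [0.9, 1.0, 1.1, 1.2, 1.3, 1.9, 2.0]

    mean_all = sum(errors_px) / len(errors_px)
    print(f"all points:    mean {mean_all:.2f} px, max {max(errors_px):.2f} px")

    # Drop the high-error points (the threshold here is arbitrary) and recompute.
    kept = [e for e in errors_px if e <= 1.5]
    mean_kept = sum(kept) / len(kept)
    print(f"after pruning: mean {mean_kept:.2f} px, max {max(kept):.2f} px")
    ```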

  • I did a robot-generated calibration, so the EOAT held the target. The focal length was pretty accurate, 12.07 instead of exactly 12. The focal distance was also almost the same.


    The target was just a circle with an X on it.

  • Most likely it's a Z height issue in your vision process setup, which would have nothing to do with calibration. Measure and adjust until it is accurate.


    Those calibration error values are in pixels, not mm, and they are good. No issue there.
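
    To put those pixel values into mm terms, here is a rough pinhole-camera estimate in Python; the sensor pixel pitch is an assumed value for a typical 1.3 MP camera, not the actual Kowa spec:

    ```python
    # Rough pinhole estimate of mm-per-pixel at the working distance.
    # The pixel pitch is an assumption for a typical 1.3 MP sensor, NOT the real camera spec.
    focal_length_mm = 12.0    # lens focal length (calibration reported ~12.07)
    working_dist_mm = 450.0   # camera-to-surface distance
    pixel_pitch_mm = 0.00525  # assumed 5.25 um pixels

    mm_per_pixel = working_dist_mm * pixel_pitch_mm / focal_length_mm
    print(f"{mm_per_pixel:.3f} mm per pixel")  # about 0.20 mm/px with these numbers
    print(f"a 2 px error is about {2 * mm_per_pixel:.2f} mm on the part surface")
    ```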

  • Can you post a picture of your calibration grid?


    Was this just printed on a printer, or was it a commercial part provided by a reputable vendor?


    Printing scale on a generic printer can be way off; this is usually a software scaling problem. The easiest way to check is to print a circle or square of specified dimensions and measure it with an accurate ruler or, better yet, a caliper. A standard calibration grid could also be printed and verified with either method.
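
    If the print does turn out to be scaled, you can either reprint or enter the measured spacing instead of the nominal one. A quick sketch of that check in Python, with made-up caliper readings:

    ```python
    # Compare a caliper measurement of a printed feature against its nominal size.
    # The numbers below are made up for illustration.
    nominal_mm = 100.0   # what the drawing says the feature should measure
    measured_mm = 98.7   # what the caliper reads on the actual print

    scale = measured_mm / nominal_mm
    print(f"print scale: {scale * 100:.1f}%")  # anything far from 100% will skew the calibration

    # If you keep this print, use the measured spacing as the grid spacing value.
    nominal_spacing_mm = 15.0
    print(f"effective grid spacing: {nominal_spacing_mm * scale:.2f} mm")
    ```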


    Some print options that usually need to be investigated:


    Here, you can see the scaling is set to 100%:



    However, on this next tab, you can see there is another setting that can really screw with things:




    To solve this on my machine, I would just choose "None":



    Another thing to always do is make sure your printer is set to the highest DPI available. On my printer, it is 1200 DPI, but many modern office machines go to 2400 DPI.


    Also make sure that economy mode or toner saver is turned off. Use the highest quality available.


    One last thing: inkjet printers are usually much less accurate than laser printers. A laser printer on photo paper works great. Better yet, get a can of 3M Super 77 spray adhesive and affix the print to a glass plate; glass is incredibly flat.


    Commercially available calibration grids are expensive for a reason, but with these methods, you can approach or match their quality.

  • I agree with Hawkme; it is most likely the application Z setting in your vision process. Measure, in the user frame you created for calibration, the height of the surface you are looking at, and use that height.

    This is a big one.


    Never use surfaces at different heights in your image; this will also cause the problem above.


    Another issue is an incorrect spacing value when using a grid.
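
    As a rough illustration of why the application Z matters so much for a 2D process, here is a simple pinhole-geometry estimate in Python; all numbers are assumed, not measured from this setup:

    ```python
    # A 2D vision process assumes the part sits at the given application Z.
    # If the real surface is higher or lower, features away from the optical axis
    # appear shifted roughly in proportion to the Z error.
    working_dist_mm = 450.0      # camera-to-surface distance the process assumes
    z_error_mm = 10.0            # assumed error in the application Z
    offset_from_axis_mm = 100.0  # distance of the found feature from the optical axis

    xy_error_mm = offset_from_axis_mm * z_error_mm / working_dist_mm
    print(f"expected XY shift: {xy_error_mm:.2f} mm")  # about 2.2 mm with these numbers
    ```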

  • Yes, I just had one target. I printed it and measured it, and the vision measurements were off at the calibration level. The user frame, the starting position of the calibration, and the level at which I measured with vision at the end were all the same. The target's height was at the robot's faceplate, since the EOAT I put the target on was flush with the faceplate.


    Then I made a 2D vision process for the target, with the same user frame and a Z height of 0.

