In a robotic vision application we always mount the camera at some distance from the pick robot (usually above a conveyor).
I know that there are many types of lenses (50 mm, 30.5 mm, ...).
I also know how big my field of view should be (based on the robot's reach).
Sometimes the camera has to be a few meters away to reduce the fisheye (distortion) effect, and therefore avoid getting "wrong" coordinates from the vision system.
The question is: is there a formula that allows me to calculate the error between the real dimensions and the distorted (fisheye) dimensions?
I mean, I calibrated the camera using a grid, I know the distance from the camera to the conveyor, I know the size of the field of view, and I know the lens parameters. There has got to be a formula out there that I could use. Obviously I Googled it, but I couldn't find anything specific to what I need.
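To make the question concrete, here is a minimal sketch of what I mean by "error formula", based on the standard radial (Brown-Conrady) distortion model that typical calibration procedures (e.g. OpenCV's `calibrateCamera`) fit coefficients (k1, k2, k3) for. All numeric values below (coefficients, point position, working distance) are hypothetical, for illustration only:

```python
def radial_distort(x, y, k1, k2, k3=0.0):
    """Map an undistorted normalized image point (x, y) to its distorted
    position using the radial Brown-Conrady model.
    Coordinates are normalized: x = (u - cx) / fx, y = (v - cy) / fy."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
    return x * factor, y * factor

def distortion_error_mm(x, y, k1, k2, k3, distance_mm):
    """Approximate metric error on the conveyor plane: the displacement a
    point suffers due to radial distortion, scaled by the working distance
    (valid for a point lying on a plane at `distance_mm` from the camera)."""
    xd, yd = radial_distort(x, y, k1, k2, k3)
    dx, dy = xd - x, yd - y
    # normalized-coordinate offsets scale linearly with distance to the plane
    return distance_mm * (dx * dx + dy * dy) ** 0.5

# Hypothetical example: k1 = -0.1 (mild barrel distortion), a point toward
# the edge of the field of view, camera 2000 mm above the conveyor.
err = distortion_error_mm(0.3, 0.2, k1=-0.1, k2=0.0, k3=0.0, distance_mm=2000.0)
print(f"distortion error: {err:.1f} mm")
```

Is this the right kind of model to use, and do the coefficients from a grid calibration plug directly into something like this?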