Hello everybody, I have a question about accuracy when jogging versus accuracy in automatic mode.
I am working with a VKRC2 running standard KRC software (KSS 4.97) with a KR 210.
What I observed is that when I manually jog the robot over the product I am working with (a rectangular box into which I mill six pockets), I see a deviation of about 0.5 mm, which probably comes from the product rather than the robot. To my mind that is very good, and so far I have no problems.
Now, when I run the program and the pockets are milled, I get a difference of 1.5 to 2 mm in the depth of the pockets. So I am wondering where this could come from.
First I checked the base I taught with a dial gauge, and that seemed to be alright: along both the X and Y axes (the long and short sides of the box) the deviation was within 0.1 mm (measured in Z). So that checks out.
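To convince myself the base really is good enough, I did a quick sanity check on what a 0.1 mm tilt over the box could cost in depth. This is a minimal sketch of that arithmetic; the box length is my own assumed number, only the 0.1 mm figure comes from the measurement above.

```python
import math

# Assumed long side of the box (not from the measurement, just illustrative):
box_length_mm = 800.0
# Measured Z deviation of the base across that side:
dz_over_length = 0.1

# The tilt angle implied by that deviation:
tilt_rad = math.atan2(dz_over_length, box_length_mm)
print(f"base tilt: {math.degrees(tilt_rad):.4f} deg")

# The depth error that tilt alone would cause at the far end of the box
# grows linearly with distance, so at the far corner it is simply:
depth_error = math.tan(tilt_rad) * box_length_mm
print(f"depth error from tilt alone: {depth_error:.2f} mm")
```

Whatever length you assume, the tilt-induced error at the far end can never exceed the 0.1 mm that was measured, so a base tilt by itself cannot explain a 1.5 to 2 mm depth difference.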
The next thing was checking that this base is actually used by the program. That is also fine.
The last thing I checked was the tool and load data. I have been using the same tool and load data for the last two years, and everything there also seems fine. (The load data may not be completely correct, but I have never had problems with other products.)
Now I basically have one question. The big difference between measuring the base (dragging a dial gauge over the product, which confirmed that the angles of the base are correct) and running the milling program is that when jogging I make one smooth motion over the product, whereas the milling program has to move up and down for every pocket. Could this up-and-down movement really cost me that much accuracy?
Or is the measurement I do while jogging wrong, and should I adjust the base until the automatic program is okay?
When it comes down to it, this is not a big problem for me, because the error is repeatable, so a simple base shift fixes it. I am just curious about what is going on.
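For reference, the base shift I mean is just moving the base origin along its own Z axis by the measured depth error (on the controller you would edit the BASE_DATA entry directly). Here is a minimal Python sketch of the frame arithmetic behind that, assuming the standard KUKA A, B, C Euler convention (rotation about Z, then Y, then X); the base values and the 1.75 mm correction are made-up illustrative numbers.

```python
import math

def kuka_abc_to_matrix(a_deg, b_deg, c_deg):
    """Rotation matrix for KUKA A, B, C angles: Rz(A) @ Ry(B) @ Rx(C)."""
    a, b, c = map(math.radians, (a_deg, b_deg, c_deg))
    ca, sa = math.cos(a), math.sin(a)
    cb, sb = math.cos(b), math.sin(b)
    cc, sc = math.cos(c), math.sin(c)
    return [
        [ca * cb, ca * sb * sc - sa * cc, ca * sb * cc + sa * sc],
        [sa * cb, sa * sb * sc + ca * cc, sa * sb * cc - ca * sc],
        [-sb,     cb * sc,                cb * cc],
    ]

def shift_base_along_own_z(base, dz):
    """Return a new (X, Y, Z, A, B, C) base whose origin moved dz along the base's own Z."""
    x, y, z, a, b, c = base
    r = kuka_abc_to_matrix(a, b, c)
    # The third column of the rotation matrix is the base Z axis in world coordinates.
    return (x + r[0][2] * dz, y + r[1][2] * dz, z + r[2][2] * dz, a, b, c)

# Hypothetical taught base and a correction of -1.75 mm (midpoint of the 1.5-2 mm error):
base = (1200.0, 350.0, 80.0, 0.0, 0.0, 0.0)
print(shift_base_along_own_z(base, -1.75))
```

With an untilted base (A = B = C = 0) this just lowers Z by 1.75 mm; for a tilted base the same function distributes the shift over world X, Y, Z correctly, which is why shifting along the base's own Z (rather than world Z) is the right move for a depth correction.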