I moved it in world in just X and just Y, and the level stays the same in both directions.
I've also tried the X direction in the user frame, and it stays level.
At each point the robot moves to. I'm using linear motion. The W, P, R of the points don't change, since I edit the positions using only X, Y, and R in the position registers.
I used the default tool frame, the first one with all values zeroed.
I have the correct user frame selected. I select both in the program using
UTOOL_NUM = 1
UFRAME_NUM=...
Hello,
I have some PRs whose X, Y, and R I change, while W, P, and Z stay the same. I checked the original PR and everything is level with the ground, but when I step through the program and check again, the level has changed. By level I mean that if I put a level on the faceplate, it's perfectly level with the ground.
I only change the PR using PR[1,1] = ..., PR[1,2] = ..., etc.
I made a user frame by jogging straight in +X and +Y in the world frame.
I thought that if you don't change the W, P, R of a position register, it would stay at the same level, but that doesn't seem to be the case.
Is there a way to keep the faceplate at the same level at all times while the robot moves through the motions?
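In case it helps to see the frame math, below is a minimal sketch in Python (not FANUC code) of how the faceplate's orientation in world comes from composing the selected user frame's orientation with the W, P, R stored in the position. It assumes FANUC's fixed-axis convention (W about X, P about Y, R about Z, composed as Rz·Ry·Rx); with the all-zero tool frame the tool contributes nothing. The point of the sketch is that a constant W, P, R only guarantees a level faceplate if the user frame itself is level.

# Minimal sketch (Python, not TP/KAREL): composing the user frame orientation
# with a position's W/P/R to get the faceplate orientation in world.
# Assumes FANUC's fixed-axis convention: R_total = Rz(r) @ Ry(p) @ Rx(w).
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def wpr_to_matrix(w, p, r):
    w, p, r = np.radians([w, p, r])
    return rot_z(r) @ rot_y(p) @ rot_x(w)

# Orientation of the user frame relative to world (from the frame setup screen),
# and the W/P/R stored in the position register (relative to that user frame).
R_uframe = wpr_to_matrix(0.0, 0.5, 0.0)    # example: user frame tilted 0.5 deg in P
R_point  = wpr_to_matrix(0.0, 0.0, 45.0)   # only R of the position was changed

# Faceplate orientation in world (an all-zero UTOOL adds nothing).
R_world = R_uframe @ R_point

# The faceplate is level only if its Z axis is still (0, 0, +/-1).
print(np.round(R_world[:, 2], 4))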
Yes, I just had one target. I printed it and measured it, and the vision measurements were off at the level of calibration. The user frame, the starting position of the calibration, and the level at which I measured with vision at the end were all the same. The target's height was at the robot's faceplate, since the EOAT I put the target on is flush with the faceplate.
Then I made a 2D vision process for the target, with the same user frame and a Z height of 0.
I did a robot-generated calibration, so the EOAT held the target. The focal length came out pretty accurate, 12.07 instead of exactly 12. The focal distance was also almost the same.
The target was just a circle with the X on it.
I calibrated at a level closer to the camera, about 400 mm, and the measurement was still off by 1-2 mm.
The minimum error value is 0.885 and the max is 2. I've calibrated multiple times, but the error values are usually in that range.
Hi,
I've done an iRVision robot-generated calibration at a height of 55 mm above the workpiece, with a printed target on an EOAT. The camera we use is a 1.3 MP camera at a height of around 500 mm. We're not getting accurate measurements: they're 1 to 2 mm off in X and 2-4 mm off in Y.
I tried measuring at the level of calibration, and the measurement was still off by around 1 mm. The measurement is taken at around 450 mm from the camera.
Just wondering why. I was thinking it might be the focus of the camera.
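One factor that's easy to quantify is how much a height mismatch between the calibration plane and the part plane alone would shift a 2D measurement. Below is a rough pinhole-model sketch in Python; the numbers are placeholders, not measurements from this setup.

# Rough pinhole-model estimate of the lateral error a 2D vision measurement
# picks up when the feature sits above or below the calibrated Z plane.
# Placeholder numbers; substitute your own geometry.
camera_height = 500.0    # mm from the camera to the calibration plane
height_offset = 10.0     # mm the feature plane sits above the calibration plane
lateral_offset = 50.0    # mm from the optical axis to the feature

# The sight line through the feature is extended down to the calibration
# plane, so the reported X/Y position shifts away from the optical axis.
error = lateral_offset * height_offset / (camera_height - height_offset)
print(f"expected shift: {error:.2f} mm")

Since the measurement is off even at the calibration level, this term alone won't explain everything, but it helps separate height effects from calibration or focus issues.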
Thanks
Hello,
We have an LR Mate 200iD with an R-30iB controller.
If I wanted to read DI/DO from a PC or Mac, and also write them, how would I go about that?
I set up Ethernet/IP with my PC; I just put the IP address in and pinged it, and that works fine.
I'm guessing it involves socket programming, but I wasn't sure how.
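One common pattern, assuming the controller has the KAREL and Socket Messaging options, is to run a small KAREL server program on the robot that listens on a TCP port, reads or sets the I/O, and answers a PC client; the PC side is then plain socket code in any language. Below is a minimal Python sketch of the PC side only. The port number and the text commands ("GET DI 101", "SET DO 102 ON") are made up for illustration and would have to match whatever the KAREL server actually implements.

# Minimal PC-side sketch. Assumes a hypothetical KAREL socket-messaging server
# on the robot that understands simple text commands; that server and its
# command format are assumptions here, not something built into the controller.
import socket

ROBOT_IP = "192.168.1.10"   # placeholder; use the robot's actual IP address
PORT = 59002                # placeholder; must match the server tag's port

def send_command(cmd: str) -> str:
    with socket.create_connection((ROBOT_IP, PORT), timeout=2.0) as s:
        s.sendall((cmd + "\n").encode("ascii"))
        return s.recv(1024).decode("ascii").strip()

print(send_command("GET DI 101"))      # hypothetical reply: "DI[101]=ON"
print(send_command("SET DO 102 ON"))   # hypothetical reply: "OK"

Other routes, such as the Modbus TCP option or explicit EtherNet/IP messaging, depend on which software options are loaded on the controller.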
Thanks.
Thank you so much! That worked.
Hello,
I was wondering how to disable singularity avoidance for a TP program. I think there's a header for it, but I'm not sure where to look. We have an LR Mate 200iD with an R-30iB controller. We're trying to use RTCP with a linear move, but it gives the error:
MOTN-203 Not support AutoSA+RTCP
Thanks
The thing is, I need to change the axis of rotation depending on the workpiece the robot picks up. We know the measurements of the workpiece and where the center of rotation should be offset from the center of the faceplate. I don't have a pointer tool, unfortunately, but I tried the six-point method with a mark at the end of our tool.
The tool is just an extension in the Z axis from the faceplate. The center is still lined up with the center of the faceplate.
I read up on it, thanks. It seems like a tool frame is the way to go for what I want to achieve. The only problem is that when rotating about an off-axis point, the robot drifts by one centimeter in the X direction. I used direct entry and just tried the default all-zero TCP with a change in Y.
I've seen posts about mastering causing drift issues, and I did a quick master to the witness marks a month or two ago.
When rotating J6 in world, W and P do change by 1 degree. Would that cause the drift?
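As a rough sanity check on the numbers: rotating by an angle θ about a pivot that is off by a distance e moves the intended point by about 2·e·sin(θ/2), and a small orientation error shows up scaled by the offset distance. A short Python sketch with placeholder numbers:

# Back-of-the-envelope check: how far the intended pivot point drifts when the
# rotation actually happens about a slightly wrong pivot, and how much a small
# orientation error matters at an offset. Placeholder numbers only.
import math

pivot_error_mm = 7.0    # distance between the intended and the actual pivot
rotation_deg = 90.0     # how far the tool is rotated about that pivot

drift = 2.0 * pivot_error_mm * math.sin(math.radians(rotation_deg) / 2.0)
print(f"pivot-error drift: {drift:.1f} mm")

# A 1 degree W/P error at a 200 mm offset from the rotation axis:
print(f"orientation-error drift: {200.0 * math.sin(math.radians(1.0)):.1f} mm")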
The tool frame defines the location and orientation of your tool relative to the robot faceplate. When your robot moves, the tool frame moves with it. This thread, A Free Open-Source E-Book for HandlingTool TeachPendant Programming, contains links to information that will help you better understand the different frame types and what they're used for.
Thank you, it looks like there's a lot of information that'll be helpful.
I think you should be using a user frame for what you are talking about. Tool frames are for EOAT, and as such their X/Y planes rotate with the robot's 6th axis. User frames, however, always stay oriented the same way. An uninitialized user frame matches the world frame exactly.
I tried this: I changed the Y of a user frame to 200. When I switched to that uframe and jogged the robot in J6, it still rotated the tool about its own center.
I also tried it in a program.
PR[1,2] = 200
UFRAME[1] = PR[1]
UFRAME_NUM = 1
PR[2,6] = PR[2,6]+90
L PR[2] 100mm/sec FINE
It still rotates about the original center of the tool.
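If the goal is to rotate a taught position about an arbitrary point in the XY plane, one option that doesn't rely on frames at all is to recompute the position's X and Y about the pivot and add the angle to R before the move. Here is a minimal sketch of the math in Python; the same arithmetic can be done in TP with PR element assignments.

# Rotate a taught position about an arbitrary pivot in the XY plane.
# All values are in whatever frame the position register is expressed in,
# and the rotation is assumed to be about Z only (W and P untouched).
import math

def rotate_about_point(x, y, r_deg, pivot_x, pivot_y, angle_deg):
    a = math.radians(angle_deg)
    dx, dy = x - pivot_x, y - pivot_y
    new_x = pivot_x + dx * math.cos(a) - dy * math.sin(a)
    new_y = pivot_y + dx * math.sin(a) + dy * math.cos(a)
    return new_x, new_y, r_deg + angle_deg   # R picks up the same rotation

# Example: position at (500, 0), pivot 200 mm away in Y, rotated 90 degrees.
print(rotate_about_point(500.0, 0.0, 0.0, 500.0, 200.0, 90.0))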
Hello,
We have an LR Mate 200iD with an R-30iB controller. I was trying to create a tool frame whose center point would be set with direct entry. The problem is that when setting the point, we know the point's data as it would be relative to the world frame.
The tool frame isn't parallel to the world frame: when jogging the robot in +X or +Y, the tool frame is at an angle to the world frame. Is there a way to force it to be aligned with the world frame? I tried direct entry with an angle offset, but it gives unpredictable results when I rotate about an off-axis point.
Thank you.
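If the aim is just to type in a tool frame whose axes line up with world at a particular robot pose, the orientation to direct-enter is the inverse of the faceplate's orientation in world at that pose. Below is a sketch of that calculation in Python, assuming FANUC's fixed-axis W/P/R convention (W about X, P about Y, R about Z); note that because the tool frame rides on the faceplate, it only stays world-parallel at that one wrist orientation.

# Sketch: compute the W/P/R to direct-enter for a tool frame so that, at the
# current robot pose, the tool frame axes are parallel to the world frame.
# Assumes the fixed-axis convention R = Rz(r) @ Ry(p) @ Rx(w); verify against
# your controller before relying on it.
import numpy as np

def wpr_to_matrix(w, p, r):
    w, p, r = np.radians([w, p, r])
    rx = np.array([[1, 0, 0], [0, np.cos(w), -np.sin(w)], [0, np.sin(w), np.cos(w)]])
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return rz @ ry @ rx

def matrix_to_wpr(m):
    p = np.degrees(np.arcsin(-m[2, 0]))
    w = np.degrees(np.arctan2(m[2, 1], m[2, 2]))
    r = np.degrees(np.arctan2(m[1, 0], m[0, 0]))
    return w, p, r

# Faceplate orientation in world at the pose of interest (read from the
# current-position screen with an all-zero tool frame selected). Placeholder:
faceplate_wpr = (180.0, 10.0, 30.0)

# The tool frame is defined relative to the faceplate, so entering the inverse
# (transpose) orientation cancels the faceplate's rotation at this pose.
R_fp = wpr_to_matrix(*faceplate_wpr)
print(matrix_to_wpr(R_fp.T))   # W/P/R values for the direct-entry tool frame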