Hi George, contact me through my linkedin https://www.linkedin.com/in/robnier-reyes-perez-17b11662/
Posts by rreype314
-
-
I am trying to phase out the use of the smartpad, and while the manual shows clearly how to disconnect the smartpad, it does not go into how to run a robot application without it.
Furthermore, even if I run an application and then disconnect the smartpad, when I try to do handGuiding() I cannot press the play button that would let me move the robot once the enabling device is active.
I could connect a monitor, mouse and keyboard to the cabinet controller and use the smartHMI from there, but that is not the solution I am looking for. I am trying to phase out the use of the smartpad, including the smartHMI software.
Is there a way to have a robot application execute after start-up, without having to select it from the drop-down menu on the smartpad?
-
Have you configured your handGuiding enabling device? Do you have a switch mechanism connected to X11?
-
-
The DataRecorder method wouldn't work, since you have to stop the application, grab the data, and upload it to a new application... unless you are OK with doing that.
Can you talk a bit more about how you want the Teach by Demonstration application to work?
-
Is there a way to command a rotation motion as a quaternion to the LBR?
I am currently converting from quaternion to Euler angles, but I would like to avoid this step.
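In case it helps anyone doing the same conversion, here is a minimal sketch in plain Java of a unit quaternion to intrinsic Z-Y'-X'' Euler conversion (the convention KUKA's A, B, C angles follow, as far as I know). The class and method names are my own, and you should verify the axis order against your setup:

```java
public class QuatToEuler {
    // Convert a unit quaternion (w, x, y, z) to intrinsic Z-Y'-X''
    // Euler angles (A, B, C) in radians.
    public static double[] toAbc(double w, double x, double y, double z) {
        // A: rotation about Z
        double a = Math.atan2(2.0 * (w * z + x * y),
                              1.0 - 2.0 * (y * y + z * z));
        // B: rotation about Y' (clamp to avoid NaN from rounding
        // near the B = +/-90 degree singularity)
        double sinB = 2.0 * (w * y - z * x);
        sinB = Math.max(-1.0, Math.min(1.0, sinB));
        double b = Math.asin(sinB);
        // C: rotation about X''
        double c = Math.atan2(2.0 * (w * x + y * z),
                              1.0 - 2.0 * (x * x + y * y));
        return new double[] { a, b, c };
    }

    public static void main(String[] args) {
        // 90 degree rotation about Z: quaternion (cos 45, 0, 0, sin 45)
        double s = Math.sqrt(0.5);
        double[] abc = toAbc(s, 0, 0, s);
        System.out.printf("A=%.3f B=%.3f C=%.3f%n", abc[0], abc[1], abc[2]);
    }
}
```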
-
The gravitational pull is the same, but that is not what you are reading from the force along the TCP z-axis when the tool is holding something.
AdvikRasha is on the right track: you have to think in terms of torque. Are you picking up the same object every time?
-
As a note on the same topic: port numbers are also a source of problems; you can only use port numbers between 30000 and 30010.
-
For safety reasons, there is no way to enter hand-guide mode (the ability to freely move any joint to any angle within its range of motion) without pressing the enabling switch.
That being said, you can use impedance mode with low stiffness and damping parameters to "freely" move the robot. Unlike true hand-guide mode, this method won't let you move anywhere in the robot's workspace.
THIS METHOD IS HIGHLY DANGEROUS!!!
Under impedance mode, the end-effector acts as a spring. If the tool load data hasn't been entered, the motion of the robot WILL become unpredictable.
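For reference, a sketch of that compliant-impedance workaround, assuming KUKA's proprietary Sunrise RoboticsAPI (the class and method names below are from that API and won't compile outside a Sunrise project; the stiffness values are illustrative, not recommendations):

```java
// Sketch only, against the Sunrise RoboticsAPI.
CartesianImpedanceControlMode softMode = new CartesianImpedanceControlMode();
softMode.parametrize(CartDOF.TRANSL).setStiffness(50.0); // N/m, very compliant
softMode.parametrize(CartDOF.ROT).setStiffness(5.0);     // Nm/rad
softMode.parametrize(CartDOF.ALL).setDamping(0.7);

// Hold the current position under the soft impedance controller so the
// robot can be pushed around by hand. REMEMBER: correct tool load data
// must be configured first, or the motion WILL become unpredictable.
lbr.move(positionHold(softMode, -1, TimeUnit.SECONDS));
```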
-
Are you talking about TCP calibration?
You can definitely design and implement your own calibration application with hand-guide but that will certainly take some time and expertise.
Try to be efficient when picking the four different poses you need for the TCP calibration method.
My approach is to place the TCP above the calibration target/point and apply a 10-20 degree rotation about one axis, save the frame, come back to the previous position, then apply a rotation about a different axis, save the frame, and so on.
This should shorten the time it takes to get four frames that differ enough for the calibration software to work.
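As a sanity check on "different enough", you can compute the relative rotation angle between two orientations from plain rotation matrices. A small self-contained Java sketch (the 15 degree delta and axis choices mirror the procedure above; this is not Sunrise API code):

```java
public class CalibPoseCheck {
    // elementary rotation matrices
    public static double[][] rotX(double t) {
        double c = Math.cos(t), s = Math.sin(t);
        return new double[][] { {1, 0, 0}, {0, c, -s}, {0, s, c} };
    }
    public static double[][] rotY(double t) {
        double c = Math.cos(t), s = Math.sin(t);
        return new double[][] { {c, 0, s}, {0, 1, 0}, {-s, 0, c} };
    }
    public static double[][] mul(double[][] a, double[][] b) {
        double[][] r = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    r[i][j] += a[i][k] * b[k][j];
        return r;
    }
    // Relative rotation angle between orientations r1 and r2:
    // the angle of r1^T * r2, recovered from its trace.
    public static double angleBetween(double[][] r1, double[][] r2) {
        double tr = 0;
        for (int i = 0; i < 3; i++)
            for (int k = 0; k < 3; k++)
                tr += r1[k][i] * r2[k][i]; // trace(r1^T r2)
        double c = Math.max(-1.0, Math.min(1.0, (tr - 1.0) / 2.0));
        return Math.acos(c);
    }

    public static void main(String[] args) {
        double d = Math.toRadians(15);
        double[][] base = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
        double[][][] poses = { rotX(d), rotY(d), mul(rotX(-d), rotY(-d)) };
        for (double[][] p : poses)
            System.out.printf("angle to base: %.1f deg%n",
                    Math.toDegrees(angleBetween(base, p)));
    }
}
```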
-
KUKA documentation explains that if the tool load is less than 1kg, the calibration process is not very accurate.
-
This is the most basic concept in robotics: use the kinematics of the robot to obtain the individual joint angles that will take it to the desired Cartesian point in space, and vice versa.
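To make that concrete, here is a toy sketch in plain Java of forward and inverse kinematics for a 2-link planar arm. The link lengths and the elbow-down solution are illustrative assumptions; a 7-DOF LBR needs a proper kinematics library:

```java
public class TwoLinkArm {
    // link lengths in metres (illustrative values)
    static final double L1 = 0.4, L2 = 0.3;

    // forward kinematics: joint angles -> end-effector (x, y)
    public static double[] forward(double t1, double t2) {
        double x = L1 * Math.cos(t1) + L2 * Math.cos(t1 + t2);
        double y = L1 * Math.sin(t1) + L2 * Math.sin(t1 + t2);
        return new double[] { x, y };
    }

    // inverse kinematics (elbow-down branch): (x, y) -> joint angles
    public static double[] inverse(double x, double y) {
        // law of cosines for the elbow angle (clamped for rounding)
        double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
        double t2 = Math.acos(Math.max(-1.0, Math.min(1.0, c2)));
        double t1 = Math.atan2(y, x)
                  - Math.atan2(L2 * Math.sin(t2), L1 + L2 * Math.cos(t2));
        return new double[] { t1, t2 };
    }

    public static void main(String[] args) {
        // IK then FK should round-trip to the same point
        double[] q = inverse(0.5, 0.2);
        double[] p = forward(q[0], q[1]);
        System.out.printf("x=%.3f y=%.3f%n", p[0], p[1]);
    }
}
```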
-
Hi,
I am starting to implement a visual servoing task and I am trying to understand the difference between the SmartServo classes and the FRI Command mode.
My goal is to use a laser rangefinder mounted on the robot to keep it at a fixed distance from a surface. The robot would move along the surface and change its altitude (i.e. position along the z-axis) based on the surface irregularities.
The overall solution would be to use FRI to transfer data in and out of the controller while using the SmartServo motion classes to change the robot trajectory. However, the FRI package has a Command mode that can also be used to change the robot path.
So, the question: why the different methods to accomplish the same task? I am sure there are valid differences and trade-offs to be noted, but they are not clear to me.
-
-
Hello everyone,
I am running Sunrise OS Med 1.0 and WoV Med 3.0.6
I am having trouble using the EK1100 coupler with input and output terminals.
As you can see in the screenshot, I seem to be missing the Sunrise I/O tab. Not sure how to approach this problem.
-
I am exploring the same problem. The getExternalTorque method from the ITorqueSensitiveRobot interface ignores the weight (gravity) and inertia of the robot. The DataRecorder class, on the other hand, seems to take both gravity and inertia into account. Since you are moving in the X direction, your Y and Z readings will be stable relative to the X reading.
-
I am using Windows 7 Ultimate edition, 64-bit. This is a compatible environment.
Not sure what KSS is.
The screenshot does not show any project because there aren't any. You cannot import a device config file while you have a project open.
-
There is a kinematics C++ library for the iiwa provided by KUKA. Reach out to them.
-
Hey, can you elaborate on that? I am using Sunrise OS Med 1.0 and WoV Med 3.0.6.
I have a new LBR Med, and these are the software packages that came with it.
-
I installed WorkVisual today, installed the Sunrise option package, and updated the DTM catalog. Then I moved on to importing the device config files and