Hi,
At my customer's site I changed the robot configuration, just as I did in RoboGuide, and the real robot now has its extended axis set as an 'integrated rail axis' (instead of an auxiliary rail axis). The robot moves to the correct X-position and from there it can follow the curves.
One more (hopefully the last) step to take:
In the files generated by RoboDK, some positions are expressed in joint coordinates and most in Cartesian coordinates. In the following text I look at two positions that are close to each other (in the real world):
- P[1] expressed in joint coordinates (J move),
- P[2] expressed in Cartesian coordinates (L move).
With the modified controller settings, both positions are now reachable for the robot, but positions expressed in joint coordinates end up on the other side of the rail from positions expressed in Cartesian coordinates. See the attached image "P1andP2.jpg". In this image I converted P[2] to joint coordinates so it can be compared with P[1] more easily. J1 makes a rotation of approximately 180 degrees to move from P[1] to P[2].
When both positions are converted to Cartesian coordinates, the problem seems to be in the Z-coordinate:
P[1]: Z = 2951 mm (not the correct position)
P[2]: Z = 302 mm (correct distance w.r.t. UserFrame_9)
See image "cartesianP1andP2.jpg".
Tests with the real robot show similar results, except that there the problem is inverted: the joint positions are on the correct side of the rail and the Cartesian positions are not.
In RoboGuide:
UF_9 seems to be shifted in the Z-direction by (2951 - 302 =) 2649 mm.
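As a quick sanity check, the pure Z-shift hypothesis can be verified with the two observed Z values (a minimal sketch; the numbers are the ones reported above, and the variable names are mine):

```python
# Observed Z values (mm) of the two nearby positions, expressed in UF_9.
p1_z = 2951.0  # P[1] after conversion to Cartesian (wrong position)
p2_z = 302.0   # P[2], correct distance w.r.t. UserFrame_9

# If UF_9 were shifted along its Z-axis, the shift should equal the difference:
shift = p1_z - p2_z
print(shift)  # 2649.0 mm -> consistent with a pure Z-shift of UF_9
```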
In the real robot:
Jogging in UF_9 results in inverted directions for X and Z, so the frame seems to be rotated 180 degrees around the Y-axis (of UF_9) and is probably also shifted.
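The jogging behaviour is consistent with that: a 180-degree rotation about the Y-axis flips X and Z while leaving Y unchanged. A minimal sketch (pure rotation only, ignoring any additional shift; `rot_y` is a hypothetical helper, not part of any robot API):

```python
import math

def rot_y(deg, v):
    """Rotate vector v = (x, y, z) by deg degrees about the Y-axis."""
    c = math.cos(math.radians(deg))
    s = math.sin(math.radians(deg))
    x, y, z = v
    return (c * x + s * z, y, -s * x + c * z)

# A 180-degree turn about Y inverts the X and Z jog directions:
x, y, z = rot_y(180.0, (100.0, 50.0, 25.0))
print(round(x), round(y), round(z))  # -100 50 -25
```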
I'm looking for the cause(s) of the differences between the simulation in RoboDK, RoboGuide, and the behaviour of the real robot.
Probably involved are:
- The coordinate system of the 3D model of the robot (base plate) in RoboGuide is rotated 180 degrees w.r.t. the 3D model in RoboDK. In both systems I chose the X-direction of the robot frame to point the same way (but the connectors of the base plates are on opposite sides).
- The real robot is mounted rotated 90 degrees w.r.t. the 3D model in RoboDK, but its calibration is (historically) such that the arm direction at J1 = 0.0 is the same as in RoboDK at J1 = 0.0.
If you have any suggestions, I'm glad to hear from you!