Robot and vision: motion offset based on the BaseX coordinate system

Robot coordinate offset based on a movable coordinate system

In the production process, when multiple coordinate systems need to be built, we can use a coordinate-system offset to link several coordinate systems together while still working directly in the robot's own coordinate system.
Because linking multiple coordinate systems usually involves both translation and rotation of a coordinate system, we set up two robot coordinate systems: one is the working coordinate system of the robot hand, called baseX; the other is the reference coordinate system at the robot's base, called base0.
[Figure: the base0 (robot base) coordinate system and the baseX (working) coordinate system]

As shown in the figure above, base0 is the coordinate system of the robot base; it is fixed to the robot and is not affected by any control. The baseX coordinate system is the working coordinate system of the robot arm and is determined directly during teaching; the baseX origin is itself a point expressed in base0. To record the reference point teach0: move the robot's base1 to the position where X and Y equal 0, find that position on the object and draw a cross there, turn on the camera for live capture, and move the robot until the cross is clearly visible in the camera image. Align the center of the image with the cross, then record the current mechanical coordinates in base0 as the coordinates of teach0. In the figure, the left picture shows the teaching workpiece; in the right picture, red is the working workpiece, black is the teaching coordinate system after rotation, and blue marks the offset X distance, offset Y distance, and offset angle that we finally send to the electrical side.
Step 1: Confirm teaching. After calibration is completed, the robot should determine a working coordinate system at the teaching position (it is recommended to use the center of the robot's end effector as the origin of the baseX coordinate system).
Step 2: Determine the coordinates of the teaching point. The robot arm is moved, under electrical control, to the workpiece position, and the coordinates of the teaching point are recorded in the base0 coordinate system (that is, as a point in the robot's global coordinate system). At the same time, the coordinates of the teaching point in the camera coordinate system must also be recorded.
[When determining the teaching point, the center of the camera should face the teaching point directly. When the working coordinates are later identified, their specific values can then be obtained from the offset.]
[Figure: camera field of view with the teaching point at the center of the image]

As shown in the picture above, the two square frames are the camera's field of view, so the teaching point should be placed at the center of the image when teaching.
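For clarity, here is a minimal sketch of how the Step 2 record might be stored in code (the `TeachPoint` class, its field names, and the example values are illustrative assumptions, not from the article):

```python
from dataclasses import dataclass

@dataclass
class TeachPoint:
    """Teaching point recorded in both coordinate systems (illustrative field names)."""
    base0_x: float   # X of the teaching point in the robot's base0 coordinate system
    base0_y: float   # Y of the teaching point in base0
    camera_x: float  # X of the same point in the camera coordinate system
    camera_y: float  # Y of the same point in the camera coordinate system

# Placeholder values for illustration only (read the real ones from the pendant and the vision software)
teach1 = TeachPoint(base0_x=0.0, base0_y=0.0, camera_x=0.0, camera_y=0.0)
```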
Step 3: Obtain the working coordinates based on the base0 coordinate system when working.
Proposal: camera.work.x is the X coordinate of the working point in the camera coordinate system (obtained directly from the photo), and camera.work.y is the Y coordinate of the working point in the camera coordinate system (also obtained directly from the photo). The working coordinates based on base0 are then:
base0.work.x = base0.Teach.x + (camera.work.x - camera.Teach.x);
base0.work.y = base0.Teach.y + (camera.work.y - camera.Teach.y);
For working points 1 and 2, we can use this same calculation to obtain their coordinates based on base0.
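A minimal sketch of this Step 3 calculation (it assumes, as the article does implicitly, that the camera coordinates have already been calibrated into robot units and robot axis directions):

```python
def work_point_in_base0(teach_base0, teach_cam, work_cam):
    """Step 3: working point in base0, from the camera offset relative to the teaching point.

    teach_base0, teach_cam, work_cam are (x, y) tuples; camera values are assumed
    to already be calibrated into robot units (mm) and robot axis directions.
    """
    wx = teach_base0[0] + (work_cam[0] - teach_cam[0])
    wy = teach_base0[1] + (work_cam[1] - teach_cam[1])
    return (wx, wy)
```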
Step 4: We now have the coordinates of teaching points 1 and 2 based on base0, the coordinates of working points 1 and 2 based on base0, and the coordinate value of baseX based on base0. (Note that there are two baseX points: one where the transfer starts and one where the transfer ends.)
Step 5: Calculate the rotation angle of the coordinate system.
TeachLineW = Teach2.X - Teach1.X;
TeachLineL = Teach2.Y - Teach1.Y;
TeachAngle = arctan(TeachLineL / TeachLineW);
WorkLineW = Work2.X - Work1.X;
WorkLineL = Work2.Y - Work1.Y;
WorkAngle = arctan(WorkLineL / WorkLineW);
OffsetAngle = TeachAngle - WorkAngle;
So the OffsetAngle we get is the offset angle of our coordinate system.
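A minimal sketch of this Step 5 calculation (using atan2 instead of a bare arctan so that a vertical line does not divide by zero; that substitution is my assumption, not from the article):

```python
import math

def offset_angle_deg(teach1, teach2, work1, work2):
    """Step 5: rotation between the teach line (teach1->teach2) and the work line (work1->work2).

    Points are (x, y) tuples in base0. Returns TeachAngle - WorkAngle in degrees,
    as defined in the article.
    """
    teach_angle = math.atan2(teach2[1] - teach1[1], teach2[0] - teach1[0])
    work_angle = math.atan2(work2[1] - work1[1], work2[0] - work1[0])
    return math.degrees(teach_angle - work_angle)
```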
Step 6: Rotate the coordinate system, and then calculate the offset from the teaching point.
Since the coordinate system has been rotated, the coordinates of the rotated teaching point, expressed in base0, will change.

[Figure: the teaching point and the baseX coordinate system before and after rotation, expressed in base0]

Not only does the teaching point change, but the teaching coordinate system baseX also changes. Because the rotation is about baseX, the coordinates of the point relative to baseX do not change, while its coordinates relative to base0 do change. We therefore perform the calculation in the base0 coordinate system and output the result directly as teaching coordinates in base0.
Derivation process:
It is known that the initial angle is α, the rotation angle is θ, the distance from the point to the coordinate origin is L, the initial teaching point is (X, Y), and the rotated teaching point is (X', Y').
The relationship between the teaching point and the coordinate origin is L = X/cosα = Y/sinα (that is, X = L·cosα and Y = L·sinα), and the length L remains unchanged, so:
X' = L·cos(α − θ), Y' = L·sin(α − θ)
Expanding: X' = L·(cosα·cosθ + sinα·sinθ), Y' = L·(sinα·cosθ − cosα·sinθ)
which gives: X' = X·cosθ + Y·sinθ, Y' = Y·cosθ − X·sinθ;
Since counterclockwise rotation is positive and clockwise rotation is negative in the KUKA robot, θ must take a negative value here.
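One more step is needed before applying this to the teaching point: the rotation is performed about the BaseX origin rather than the base0 origin, which means translating the point so that BaseX sits at the origin, rotating, and translating back. Written compactly (my notation, with (X₀, Y₀) the BaseX origin in base0 and θ counterclockwise-positive):

$$
\begin{pmatrix} X' \\ Y' \end{pmatrix}
=
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} X - X_0 \\ Y - Y_0 \end{pmatrix}
+
\begin{pmatrix} X_0 \\ Y_0 \end{pmatrix}
$$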
Therefore, applying this rotation about the BaseX origin gives the coordinates of the rotated teaching point:
nowTeach1.X = (Teach1.X - BaseX.X)*cosθ - (Teach1.Y - BaseX.Y)*sinθ + BaseX.X;
nowTeach1.Y = (Teach1.X - BaseX.X)*sinθ + (Teach1.Y - BaseX.Y)*cosθ + BaseX.Y;
The final offset is then:
OffsetX = work1.X - nowTeach1.X;
OffsetY = work1.Y - nowTeach1.Y;
Finally, the values we output to the electrical side are: OffsetAngle (the offset rotation angle), OffsetX, and OffsetY.
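Putting Steps 5 and 6 together, here is a minimal end-to-end sketch of the offset calculation (the function and variable names are mine; angles are handled in radians internally, and the sign applied to the rotation follows the article's KUKA note, so it should be verified on the actual machine):

```python
import math

def compute_offsets(teach1, teach2, work1, work2, basex):
    """Return (OffsetX, OffsetY, OffsetAngle in degrees) to send to the electrical side.

    teach1/teach2 : teaching points 1 and 2 in base0, as (x, y)
    work1/work2   : working points 1 and 2 in base0, as (x, y)
    basex         : origin of the baseX working coordinate system in base0, as (x, y)
    """
    # Step 5: OffsetAngle = TeachAngle - WorkAngle
    teach_angle = math.atan2(teach2[1] - teach1[1], teach2[0] - teach1[0])
    work_angle = math.atan2(work2[1] - work1[1], work2[0] - work1[0])
    offset_angle = teach_angle - work_angle

    # Step 6: rotate teaching point 1 about the baseX origin.  Per the article's KUKA
    # note (counterclockwise positive), theta takes the negative value of OffsetAngle;
    # verify this sign on the actual setup.
    theta = -offset_angle
    dx, dy = teach1[0] - basex[0], teach1[1] - basex[1]
    now_teach_x = dx * math.cos(theta) - dy * math.sin(theta) + basex[0]
    now_teach_y = dx * math.sin(theta) + dy * math.cos(theta) + basex[1]

    # Final translation offset between working point 1 and the rotated teaching point 1
    offset_x = work1[0] - now_teach_x
    offset_y = work1[1] - now_teach_y
    return offset_x, offset_y, math.degrees(offset_angle)
```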

Troubleshooting: what to check when the coordinates are abnormal, that is, when the output coordinate points have become inaccurate.
Install a laser. Coordinate-system offsetting usually appears in hand-eye grasping or hand-eye positioning applications, so we install a laser to make later step-by-step verification easier. The laser is mounted on the robot hand, its position must remain fixed at all times, and it should project its spot vertically so that the laser point can be marked (a level can be used to verify the laser's mounting orientation).
1. Determine whether the calibration file is wrong. Teach the current point, jog the robot hand to photo position No. 1 and acquire an image, then move the robot by X+10 in the base0Tool1 coordinate system and take another photo. Next move the robot to photo position No. 2, again move X+10 in the base0Tool1 coordinate system, and take another photo. Now calculate the offset: the output should be X = -10, Y = 0, A = 0. If X comes out as approximately +10, the X axis was reversed during calibration (redo the calibration with the original calibration file reversed). If Y deviates by more than 1, the calibration coordinates and the robot coordinates were inconsistent during calibration (re-teach the robot coordinates). If A has an error of more than 0.2 degrees, check whether the template-matching position is accurate and whether the calibration was performed in the base0 coordinate system. In the same way, repeat the steps moving Y+10, or moving X and Y by +10 at the same time. (A small sanity-check sketch is given after this list.)
2. Check whether the coordinate axes of the manipulator's newly created base1 are parallel to and in the same direction as the axes of base0. After the calibration file has been verified as normal, move the robot to photo position No. 1, take a photo, move the robot by X+10 in the Base1 coordinate system, take another photo, and take the difference between the calibrated coordinates after and before the move. If the result is approximately +10, the axis is correct; if it is -10, the X axis of base1 points in the opposite direction to that of base0; if it is less than 10 and the difference exceeds 1, the base1 axis is not parallel to the base0 axis. Check the Y-axis direction in the same way.
3. When items 1 and 2 show no problem, recheck whether the teach0 point is correct. Move the robot's base1 to the position where X and Y equal 0, find that position on the object and draw a cross there, turn on the camera for live capture, and move the robot until the cross is clearly visible in the camera image. Align the center of the image with the cross, then record the current mechanical coordinates in base0 as the coordinates of teach0.
4. Check whether the robot program moves the coordinate system correctly. Move to an open location and mark the spot where the laser point falls. After teaching, have the robot take a test photo, add the resulting error value to the coordinate value of base1, rewrite it into the base1 coordinate system, and return the robot to the teaching position. If the robot roughly returns to the teaching position (error less than 0.5), the robot program is wrong and should instead follow a program based on the coordinate-system offset (or there is a program error). If the error is greater than 0.5, the vision program may be at fault: check whether the program's accuracy is up to standard and whether the calculation is correct.
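As referenced in item 1 above, here is a small sketch of that tolerance check (the thresholds of 1 in Y and 0.2 degrees in A come from the article; the function name and the ±1 tolerance used to recognize a reversed X axis are my assumptions):

```python
def check_calibration_offset(offset_x, offset_y, offset_a_deg, tol_xy=1.0, tol_a=0.2):
    """Item 1 sanity check: after moving the robot X+10 in base0Tool1, the
    computed vision offset should be approximately X = -10, Y = 0, A = 0."""
    if abs(offset_x - 10) < tol_xy:
        print("X came out as +10: the X axis direction was reversed during calibration")
    if abs(offset_y) > tol_xy:
        print("Y deviates by more than 1: calibration and robot coordinates are inconsistent")
    if abs(offset_a_deg) > tol_a:
        print("A error exceeds 0.2 deg: check template matching and base0 calibration")
```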

When the baseX coordinate system is established, the camera's teach0 and the robot's base0 coordinates are normally inconsistent. The reason is that when the 3-point method is used to establish the robot's BaseX, a certain point on the fixture is usually chosen as the reference point, while the teach0 captured by the camera is not that same point. As a result, the coordinates of teach0 differ from the coordinates of the baseX origin in base0, and we can speak of a BaseX under the camera and a BaseX under the robot. If the origin of baseX were used directly as teach0, there would be an error caused by the angle and distance between the camera and the robot's baseX point. When we use the camera to align with the baseX point, we effectively make the camera's BaseX coincide with the robot's BaseX, which compensates for the error caused by the positional difference between the camera and the robot.

Source: blog.csdn.net/m0_51559565/article/details/128288344