University of Twente Student Theses
Integration of a vision system with a MOTOMAN HP3 robot system at the Universidad Autónoma de San Luis Potosí
Schipper, E. (2014) Integration of a vision system with a MOTOMAN HP3 robot system at the Universidad Autónoma de San Luis Potosí.
PDF, 1MB |
Abstract: | This is a summary of the internship of Erald Schipper, performed at the Universidad Autónoma de San Luis Potosí (UASLP). The main goal of the internship is to apply a previously developed vision algorithm in a working application using a MOTOMAN HP3 six-degree-of-freedom robot. The vision algorithm, developed in earlier research, determines the location of one or more points in a captured snapshot. The internship has been divided into three parts. In the first part the robot is tested for its capabilities. Secondly, the connection between two different systems on different computers is examined: the first system controls the camera and runs the vision algorithm in Matlab, acting as the slave; the second system controls the robot using a library in C++ and acts as the master, which controls the slave. Thirdly, the established connection and the robot capabilities are combined in a test that runs the robot from point to point using an iterative learning process. This way the points on the paper can be found with high accuracy and used as waypoints for the robot. A calibration method for the system has also been developed. The first part leads to the following conclusions. The robot can be moved from point to point in three different control groups. A control group is a group of axes that can be controlled; it defines the location of the Tool Control Point (TCP) and the orientation of the coordinate system used. The first control group is “base”, meaning the movements are made with respect to a fixed base in space. The second is “robot”, meaning the origin of the system moves with the robot; this is relevant when the robot base is mounted on a moving platform. In this case the robot is fixed to the world, so robot and base share the same fixed point in space.
The third control group is “station”, in which the origin of the system lies on a specified work station; it is not used in the system at the UASLP. The control group used is the one called “base”. The robot can be moved using four coordinate systems. The first is joint coordinates, meaning the joints of the robot are moved separately. The second is cartesian coordinates, meaning the end effector moves in the X, Y and Z directions and can be rotated around the TCP. All systems except the joint system use cartesian coordinates but define them differently; therefore this second system is specifically called cartesian coordinates. The third is user coordinates, which places the origin (0,0,0) at a user-defined point in space and rotates the axes of the coordinate system by a user-defined angle around the TCP. Tool coordinates is the last system; it moves the robot with respect to the TCP at the tip of the robot, and the rotations are again rotations of the axes of the coordinate system around the TCP. The TCP can be altered using the software called High Speed Job Exchanger. The exact use of the different control groups in combination with the coordinate systems has not become fully clear, because only one control group was tested, with the cartesian and joint coordinate systems. There are three ways of moving the robot: by an increment in millimetres or a rotation of an axis, by specifying pulses for every joint, or by specifying the coordinates of the final position in millimetres and the rotations of the axes in degrees. Joint movement moves the robot in a kinematically favourable way, while linear movement moves the robot at constant speed in a straight line to the next point. The robot can be controlled using the Motocom32 library; its functions are documented in the file MOTOCOM32_US.pdf. The connection between the slave and the master is made using sockets. The slave waits until the master opens a socket. Several functions have been developed in the master program.
It can open and close a connection, which has to be established for every separate operation. The master program can send a trigger to the slave, can receive a trigger, and both programs can send and receive data. The slave opens a connection using the TCP/IP command in Matlab; this connection is used to send and receive the data and triggers. The obtained knowledge is tested by finding several drawn points on a piece of paper, using iterative learning. The robot runs an initial path with the TCP within 8 mm of each drawn spot. At every point a snapshot is taken and the slave calculates the deviation in millimetres from the spot to the TCP. This correction is then sent to the master, which constructs a new path for the next run. The procedure can be repeated to iterate towards an improved path: at every run the error ε decreases. No further decrease of ε has been observed after 3 iterations, and the final ε is no larger than 16 μm. Before the test can be run, the system is calibrated. The calibration is done by moving the robot to a start position in which a single point shows in the camera image. Next, the robot is moved through a square matrix of positions around that point; the matrix spans 20x20 mm² and consists of 9 positions. In this way the conversion between pixels and millimetres is obtained. The robot moves a specified number of times to the point, each time from a different direction on a circle around the point. This minimises the influence of the robot kinematics and provides more statistical data. The influence of lens distortion can also be seen in the results: a deviation of 10 mm from the optical centre results in a maximum error of 0.2 mm. Because of the iteration steps the found points move towards the optical centre, minimising the errors due to distortion.
A manual for the software has been written and can be found in the full report, together with guidelines on how to cope with errors and bugs. In all circumstances the user should be careful when using the software, as sudden moves can damage the camera or the system. |
Item Type: | Internship Report (Master) |
Clients: | Universidad Autónoma de San Luis Potosí, Mexico |
Faculty: | ET: Engineering Technology |
Subject: | 52 mechanical engineering |
Programme: | Mechanical Engineering MSc (60439) |
Keywords: | Vision, System Integration, Robot, Iterative Learning |
Link to this item: | https://purl.utwente.nl/essays/69282 |