An Automatic Welding System Based on Spatial Position Measurement

Using image capturing techniques, numerical calculations, and laser measurements, an optical measurement system for welding location and distance was constructed to replace the current reliance on human judgment. This system combines an optical measurement system, a clamping workbench for workpieces, and a device with two sets of welding guns equipped with two planar biaxial translation devices, each consisting of two screws and step motors. Two sets of clamping mechanisms on the arm fix the laser range finders and image capturing devices in place. In this study, the image measurement system obtains the location of the welding bead, a laser range finder measures its distance, and the integrated position information determines the location of the welding torch. All of the movement information is eventually sent to the controller to adjust the welding gun and cooperate with the cylinder holder platform for the automatic welding process, significantly improving production efficiency. Additionally, this system can measure lines, outer and inner circles, angles, distances, and relative heights, and can send the data to the torch to carry out the welding process.


Introduction
Manufacturing procedures should consider both working precision and inspection accuracy. To reduce the demand for personnel in manufacturing systems, the policy in modern factories is to use many contact and noncontact automatic devices. Noncontact devices do not influence the workpiece and are therefore often used to inspect defective or fragile surfaces.
Automated optical inspection (AOI) is the major technology applied in noncontact inspection devices. A light is shone on a body and its reflection is captured by a camera. This method resembles a human inspecting the body by sight; therefore, image measurement can be applied to various industrial fields for tasks such as workpiece measurement, identification, and testing.
In recent years, machine vision has been applied to detection, measurement, recognition, and positioning. A machine using a visual inspection system can measure the diameter and edge of a microdrill; in one study, after testing a variety of drills, the system was demonstrated to be capable of measuring microdrill power and stability. (1) By acquiring data at the closest and furthest points of a workpiece from a camera, which can be achieved by setting a threshold value using nonuniformity correction between the two points to handle the data, the displacement can be determined using a correlation function. (2) One study adopted the vertical moving path of a CCD camera and integrated a revised similar triangle algorithm to establish the partial precision edge line of a denture. (3) In Ref. 4, responding to the necessity for an automated tool monitoring system in an unmanned machine environment, a machine vision system that employed a statistical filtering method to examine the wear area of a tool was set up. Another study put a drill on a rotary platform that acted in the plane of motion; the image of the drill's gauge edge was then acquired by a CCD camera, and the wear of the gauge edge could be estimated using an image processing algorithm. (5) A machine vision system was used in Ref. 6 to automatically examine the surface defects of thin metal wires (diameter, 50-2000 μm) and to verify the results through electron microscopy. Machine vision and laser measurement technology are commonly included in AOI systems. Laser ranging technology uses laser beams to determine the distance from a target. In Ref. 7, on a large scale, a laser and machine vision were utilized to measure the straightness of objects, and the scattering angle of the laser beam was increased in Ref. 8 to measure the roughness of surfaces. In Ref. 9, an airborne laser was used to outline landforms by measuring distances. 
Another application used an AOI system in visual servoing for a driverless automobile. (10) Additionally, an AOI system combined with force control was proposed in Ref. 11 to achieve tracking control in robotics, although this study discussed simulations only.
In manual machining programs, the correctness of the machining path and the yield rate of the product are affected by human imprecision. Most of the research covered in the literature focuses on two dimensions; however, manual operation tends to involve three dimensions. Therefore, this study replaced manual operation with a spatial tracking system for reducing the faults introduced by the human factor. The AOI system used machine vision for edge finding and a laser range finder to determine distance. This intelligent tracking system concept was applied to an automatic welding system for verification.

Image processing
The data for a color image captured by a CCD camera can be transformed into a two-dimensional grayscale function, f(x,y). The values (x,y) represent the spatial coordinates of every pixel, and f gives the grayscale value at each coordinate. The transformation follows the International Telecommunication Union standards. Notably, when pixels are expressed in grayscale, the value of any pixel ranges from 0 to 255; the higher the value, the brighter the pixel.
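The grayscale conversion described above can be sketched as follows. This is a minimal illustration, assuming the common ITU-R BT.601 luma weights (0.299, 0.587, 0.114); the paper does not state which ITU standard was used, so the exact coefficients are an assumption.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an RGB image of shape (H, W, 3) to 8-bit grayscale
    using ITU-R BT.601 luma weights (an assumed standard; the paper
    only says the conversion follows ITU specifications)."""
    weights = np.array([0.299, 0.587, 0.114])
    gray = rgb[..., :3] @ weights          # weighted sum per pixel
    return np.clip(gray, 0, 255).astype(np.uint8)

# A pure white pixel maps to 255, the brightest grayscale value.
white = np.array([[[255, 255, 255]]], dtype=np.uint8)
gray_white = to_grayscale(white)
```

The result is a single-channel array in which each pixel takes a value from 0 (darkest) to 255 (brightest), matching the range described in the text.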
Because the quality of an image can be affected by uneven light, insufficient contrast, or noise from the surroundings or the camera system, image filtering was employed for noise removal and contrast improvement. In this study, image filtering was divided into spatial and frequency domains. The method adopted for the spatial domain addresses every pixel in the image and can be represented by g(x,y) = T[f(x,y)], where f(x,y) is the input image, g(x,y) is the output image, and T is an operator defined over a neighborhood of (x,y). The operator mostly employed a 3 × 3 square image mask acting on every (x,y) to generate g(x,y). Although a low-pass filter can strengthen the portion of the image that changes gradually and suppress the portion that changes rapidly, it often blurs the image. Similarly, a high-pass filter can highlight the high-frequency portion by increasing the values of its pixels and decreasing the values of other pixels in its neighborhood, but this often amplifies image noise. In contrast, a median filter replaces the grayscale value of every pixel with the median, rather than the mean, of the 3 × 3 mask; this method is less likely to blur outlines and can eliminate unusually large or small outliers. Consequently, this study used a median filter to process grayscale images.
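The 3 × 3 median filtering step above can be sketched as a direct sliding-window implementation. This is an illustrative version, not the authors' exact code; the border handling (replicating edge pixels) is an assumption, since the paper does not specify it.

```python
import numpy as np

def median_filter_3x3(gray):
    """Replace each pixel with the median of its 3x3 neighborhood.
    Borders are handled by replicating edge pixels (an assumed
    convention). The median, unlike the mean, discards isolated
    outliers without blurring edges, as described in the text."""
    padded = np.pad(gray, 1, mode="edge")
    out = np.empty_like(gray)
    h, w = gray.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.median(padded[y:y + 3, x:x + 3])
    return out

# A single bright outlier in a uniform region is removed entirely.
noisy = np.full((3, 3), 100, dtype=np.uint8)
noisy[1, 1] = 255
clean = median_filter_3x3(noisy)
```

In practice a vectorized routine such as `scipy.ndimage.median_filter` would be preferred for speed; the explicit loops here simply make the per-pixel operation visible.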
Binarization is a valuable method of image segmentation, and the processing results distinguish the background from the part being measured. Binarization can be expressed as g(x,y) = 255 if f(x,y) ≥ K, and g(x,y) = 0 otherwise, where K is the threshold. The threshold value K could be set manually, but we adopted a statistical threshold method for determining the value of K. (12)
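A minimal sketch of the binarization step follows. The statistical threshold of Ref. 12 is not specified in the text, so the image mean is used here as an illustrative stand-in; the paper's actual method may differ.

```python
import numpy as np

def binarize(gray, k=None):
    """Threshold a grayscale image into foreground (255) and
    background (0). If K is not given, fall back to the image mean
    as a simple statistical threshold -- an assumption standing in
    for the method of Ref. 12."""
    if k is None:
        k = gray.mean()
    return np.where(gray >= k, 255, 0).astype(np.uint8)

# Pixels at or above the threshold become 255; the rest become 0.
gray = np.array([[0, 200], [0, 200]], dtype=np.uint8)
binary = binarize(gray)
```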

Laser ranging
The theory of laser range finding can be divided into time-of-flight measurement, phase-difference measurement, and triangulation. In this study, a triangulation method was used to measure distance. Optical triangulation estimates the distance between the light source and the analyte from the triangle formed by the laser source, the projected spot on the analyte, and the detector.
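The similar-triangles relation underlying optical triangulation can be illustrated as below. The parameter names (baseline, focal length, spot offset) are assumptions for illustration; a commercial sensor such as the RF603 applies its own factory calibration rather than this idealized formula.

```python
def triangulation_distance(baseline_mm, focal_mm, spot_offset_mm):
    """Idealized optical triangulation: the laser source, the lens,
    and the imaged spot form similar triangles, so the target
    distance is z = baseline * focal_length / spot_offset.
    All parameters are illustrative assumptions, not the RF603's
    internal calibration."""
    if spot_offset_mm <= 0:
        raise ValueError("spot offset must be positive")
    return baseline_mm * focal_mm / spot_offset_mm

# e.g., 30 mm baseline, 16 mm lens, 2 mm spot offset on the detector
z = triangulation_distance(30.0, 16.0, 2.0)
```

As the target moves farther away, the spot offset on the detector shrinks, so the estimated distance grows, which is the geometric principle the text describes.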

Geometric measurement
Image processing provides a relatively clear image, and the pixel is the basic unit of an image, represented by coordinates. In manufacturing procedures, regression equations must be integrated to restore the original size of the workpiece; therefore, this study utilized coordinate values and regression equations to reconstruct the machining path. Specifically, the least squares regression line, least squares arc, intersection of two lines, intersection of a line and an arc, intersection of two arcs, distance between two points, and perpendicular distance from a point to a line were used. For example, after the coordinates of several points (x_i, y_i) are obtained through image processing, the equation of a line can be expressed as

y = ax + b, (2)

where a and b are the coefficients of the equation, and the error can be defined as

e_i = y_i − (ax_i + b). (3)

The sum of the squares of the errors can be expressed as

S = Σ e_i² = Σ [y_i − (ax_i + b)]². (4)

To minimize this objective function, the partial derivatives with respect to the coefficients were calculated and set to zero. The result can be rearranged into the following matrix form:

| Σx_i²  Σx_i | | a |   | Σx_i·y_i |
| Σx_i    n   | | b | = | Σy_i     |. (5)

The coordinates (x_i, y_i) can be substituted into Eq. (5), Cramer's rule used to determine the coefficients a and b, and the coefficients then substituted into Eq. (2) to obtain the equation of the line. This least squares regression line method was adopted in this work.
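The least squares line fit above can be sketched directly from the normal equations, solving the 2 × 2 system with Cramer's rule as in the text:

```python
def least_squares_line(points):
    """Fit y = a*x + b to a list of (x, y) points by minimizing the
    sum of squared errors. Setting the partial derivatives of
    S = sum((y_i - a*x_i - b)^2) to zero yields a 2x2 linear system,
    solved here with Cramer's rule."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    det = n * sxx - sx * sx           # determinant of the normal matrix
    if det == 0:
        raise ValueError("points are degenerate (vertical line)")
    a = (n * sxy - sx * sy) / det     # slope
    b = (sxx * sy - sx * sxy) / det   # intercept
    return a, b

# Points lying exactly on y = 2x + 1 recover a = 2, b = 1.
a, b = least_squares_line([(0, 1), (1, 3), (2, 5)])
```

The same normal-equation approach generalizes to the least squares arc fit mentioned in the text, with three unknowns instead of two.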

Unit transformation
The pixel is the unit of image-based measurement, but it must be converted into metric units of length for fabrication. In this study, the conversion was based on a gauge block. When the magnification and focal length of the CCD camera remain unchanged, which fixes the distance between the gauge block and the camera lens, Eq. (6) gives the coefficient for converting image units into length:

ω_1 = t_gauge block / p_image, (6)

where ω_1 is the coefficient for converting the image unit into length (mm/pixel), t_gauge block is the actual size of the gauge block, and p_image is the size of the gauge block in pixels. In this study, the verification collocated a moving plane with four axes, and a step motor was used as the driving device for every axis. Hence, Eq. (7) was used to convert the coordinates of the image into data for driving the step motor:

ω_2 = D_mm / (d_pulse · ω_1), (7)

where ω_2 is the coefficient for converting image coordinates into step-motor drive data (pixel/pulse), D_mm is the screw lead, and d_pulse is the resolution of the step motor (pulses per revolution). These system operations are summarized in Fig. 1.
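The two conversions can be sketched as below. Eq. (6) follows directly from the gauge-block calibration; the exact form of Eq. (7) is not printed in the text, so the formula here (screw travel per pulse divided by mm per pixel) is inferred from the stated units and should be treated as an assumption.

```python
def mm_per_pixel(gauge_block_mm, gauge_block_pixels):
    """Eq. (6): length represented by one image pixel, calibrated
    against a gauge block of known physical size."""
    return gauge_block_mm / gauge_block_pixels

def pixels_per_pulse(screw_lead_mm, pulses_per_rev, w1):
    """Sketch of Eq. (7), inferred from the units (pixel/pulse):
    one motor pulse advances the screw by lead/resolution in mm,
    which corresponds to (lead/resolution)/w1 pixels in the image.
    The exact published form of Eq. (7) is assumed, not quoted."""
    mm_per_pulse = screw_lead_mm / pulses_per_rev
    return mm_per_pulse / w1

# e.g., a 9 mm gauge block spanning 900 pixels gives 0.01 mm/pixel;
# a 2 mm screw lead with a 200 pulse/rev motor then moves 1 pixel/pulse.
w1 = mm_per_pixel(9.0, 900.0)
w2 = pixels_per_pulse(2.0, 200, w1)
```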

Structure of the measurement system
The experimental architecture of this work includes: a CCD camera (Guppy series, Allied Vision Technologies) for capturing images and communicating data to a computer via an IEEE 1394a interface; a laser range finder (RF603 laser triangulation sensor, Riftek Ltd.) for measuring distance and communicating data to a computer via USB; a moving platform with step motors (200 pulse/rev) to drive four axes; a step motor controller (Model PMX-4CX-SA, Arcus Technology) for controlling the step motors; and a measurement program written by the authors in Visual Basic. All of the instruments were fabricated, and their efficacy was verified on a welding system for thermal storage barrels. The front and side views of this welding system are depicted in Fig. 2.

Verification and Analysis
To verify the positioning accuracy of the moving platform, a laser interferometer (comprising an XL-80 laser measurement system and an XC-80 environmental compensation unit, Renishaw) was used to measure the error along each axis. The results for the x and y axes are shown in Figs. 3 and 4. According to the data, as the travel distance along the platform increased, the positioning errors of the x- and y-axes grew approximately linearly. Possible factors include missteps of the step motor or backlash of the screw. From the calculations, the x- and y-axis compensations should be 1.5 × 10⁻⁴ and 6.52 × 10⁻⁴ mm, respectively, per millimeter of travel. The angle and displacement errors must also be verified for the rotation axis; schematics are shown in Figs. 5 and 6. To confirm the repeatability of the measurement, gauge blocks of three different sizes (4, 7, and 9 mm) were placed on the cylinder, which was then rotated and reinstated. The experimental standard deviations for the 4, 7, and 9 mm gauge blocks were 0.00397, 0.0032, and 0.0051 mm, respectively. The 4 mm gauge block was then placed on the cylinder again, which was rotated one revolution and reinstated to check the repeatability of the distance measurement. The test was performed three times, and each time the cylinder was rotated for ten revolutions. The experimental standard deviation was 0.0027 mm the first time, 0.0029 mm the second time, and 0.0017 mm the third time.

Results and Discussion
In this study, a spatial tracking system was constructed using image processing technology and a laser range finder to collocate the measurement program constructed by the authors. To verify the tracking performance, its repeatability and accuracy were tested. The results of this study are as follows: A spatial system was constructed for path tracking and machining. The system acquired the coordinates of the spatial path, which were used to establish the machining path, and the parameters of the workpiece were compared before and after machining.
In the repeatability analysis, the errors in the height repeatability function and the relative height repeatability function were both under 6 μm, the error of the machine depth repeatability function was under 5 μm, and the error of the image repeatability function was under 3 μm.
The proposed intelligent tracking system was applied to a welding system for thermal storage barrels. To confirm the effectiveness of the system, 10 L of water was poured into the barrels for leak testing. The experimental welding work was divided into two groups: manual operation and the proposed system. Both groups carried out three series of welding operations, and in each series 200 barrel welding operations were performed. The results of the leakage test are presented in Table 1.

Conclusion
In the evolution of automated processes, sensors have always played a vital role. As technology advances, the development of sensors to measure various physical quantities enhances the degree of automation. Compared with other human senses, replicating the optical sense is perhaps the last piece of the puzzle in simulating human perception, which is desirable because human vision cannot work for long hours and remain effective. However, if the information captured by an optical sensor is processed appropriately, it may suitably substitute for human vision, with the benefit that the sensor is capable of working long hours without wearing out. Several studies reviewed earlier in this paper (e.g., measuring the diameter of a microdrill edge, monitoring the wear of tools in an unmanned machine environment, and examining surface defects of thin metal wires automatically) confirm that the information captured by an image sensor can effectively replace human vision. With the addition of optical ranging, optical sensing has entered the three-dimensional world. Other references cited in this study (e.g., measuring the straightness of objects in larger spaces, measuring the roughness of surfaces, creating driverless automobiles, and measuring distances to outline landforms) also substantiate the feasibility of optical sensing.
This research fuses a two-dimensional plane image with a depth measurement to construct an intelligent space tracking system. An AOI system was developed to track a welding bead during welding, and after testing, the results indicated that the functioning of this system is superior to manual operation.
Although we conclude that this is an effective tracking system for welding processes, it could still be improved:
1. To avoid errors from the movement of the mechanism, a servo motor could enhance precision by preventing missteps of the step motor during the image capture and tracking stages.
2. To reduce errors in image measurement, a telecentric lens should be used to avoid edge distortion.
3. Manual focusing is prone to errors in image conversion; therefore, a high-resolution camera and automatic focusing should be considered in future research.