Planar-Target-Based Structured Light Calibration Method for Flexible Large-Scale 3D Vision Measurement

The calibration of the structured light plane is the key technique in structured light 3D vision measurement. In this paper, a novel method of structured light plane calibration is presented. A simple 2D planar target is used, which can be freely moved to different positions in the visible space of a camera, provided the target and the structured light plane intersect with each other. A corresponding world coordinate system is defined for the target at each position. Through data processing, the equations of the intersection lines obtained at different positions are automatically unified in the camera coordinate system. The least squares method is then used to fit the equation of the structured light plane. An indoor experiment under a large-scale measurement situation has been performed. The experimental results show that the proposed method is simple, universal, and highly efficient.


Introduction
Structured light 3D vision measurement technology can achieve fast, noncontact measurement of 3D morphology characteristics. It is widely used in reverse engineering and online detection. The calibration of the structured light plane is the key technique in structured light 3D vision measurement and has been studied by many scholars. Liu et al. (1) proposed a calibration method based on a tooth-shaped target and a one-dimensional stage. Zhou and Zhang (2) introduced a calibration method that uses two cameras and a planar target placed along the structured light plane. However, these methods require special equipment, such as a dedicated calibration target or an extra camera, and involve an inefficient, complex procedure that is not suitable for field calibration. Xu and Zhang, (3) Sun et al., (4) and Zhang (5) proposed calibration methods based on cross-ratio invariability. These methods only need a 2D planar target freely moved to different positions in the visible space of the camera, which makes them much more suitable for in-field application than the previous methods. However, their data processing procedures are still of low efficiency.
In this paper, a novel method of structured light plane calibration is proposed. A simple 2D planar target is used. The target can be freely moved to different positions in the visible space of the camera, provided the target and the structured light plane intersect with each other. A corresponding world coordinate system is defined for the target at each position, together with the transformation relationships among the world, camera, and image coordinate systems. Through data processing, the equations of the intersection lines obtained at different positions are automatically unified in the camera coordinate system. The least squares method is then used to fit the equation of the structured light plane.

Structured Light System Model
In the structured light 3D vision measurement system, the perspective imaging process can often be approximated by a pinhole imaging model. (6) A schematic of the pinhole perspective transformation for the camera is shown in Fig. 1. The world coordinate system o_w-x_w y_w z_w has its origin at a vertex of the target; the plane o_w-x_w y_w coincides with the target plane, and the axis o_w-z_w is perpendicular to the target plane. The origin o_c of the camera coordinate system o_c-x_c y_c z_c is the optical center of the camera. The ideal image coordinate system o_I-x_I y_I has its origin o_I at the intersection of the axis o_c-z_c and the image plane. Because the origin of an image in a computer is usually its vertex, the computer image coordinate system o_0-u_0 v_0 has its origin o_0 at the top-left vertex of the image.
Taking line structured light as an example, the structured light plane projected by the light source intersects the target in the line L. P_w is a point on the line L, with coordinates P_w(x_w, y_w, z_w) in the world coordinate system o_w-x_w y_w z_w, P_c(x_c, y_c, z_c) in the camera coordinate system o_c-x_c y_c z_c, P_I(x_I, y_I) in the ideal image coordinate system o_I-x_I y_I, and P_0(x_0, y_0) in the computer image coordinate system o_0-u_0 v_0. The conversion between the world coordinate system and the computer image coordinate system is as follows:

s [x_0 y_0 1]^T = A [R T] [x_w y_w z_w 1]^T,  A = [α c u_0; 0 β v_0; 0 0 1],  (1)

where s is the scale factor and the matrix [R T] contains all the external parameters. R and T denote the rotation and translation, respectively, between the world coordinate system and the camera coordinate system. A stands for the matrix of internal parameters, and (u_0, v_0) is the coordinate of the point o_I, the origin of the ideal image coordinate system, in the computer image coordinate system. The term α (α = f/dx) is the normalized focal length of the camera in the x-direction, where f is the focal length of the camera and dx is the physical size of each pixel in the x-direction. Likewise, β (β = f/dy) is the normalized focal length in the y-direction, where dy is the physical size of each pixel in the y-direction. The term c is the non-perpendicularity (skew) factor between the x-axis and y-axis of the camera, which depends on the structure of the camera's light-sensitive device.
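As a minimal sketch of the projection in eq. (1), the following code maps a world point through [R T] and the internal matrix A to computer image coordinates. All parameter values are illustrative, not the calibrated ones reported later in Table 1.

```python
import numpy as np

# Illustrative internal parameters (not the calibrated values of Table 1).
alpha, beta, c = 1000.0, 1000.0, 0.0      # normalized focal lengths, skew
u0, v0 = 320.0, 240.0                     # principal point (pixels)
A = np.array([[alpha, c, u0],
              [0.0, beta, v0],
              [0.0, 0.0, 1.0]])

R = np.eye(3)                             # rotation: world -> camera
T = np.array([[0.0], [0.0], [500.0]])     # translation (mm), illustrative

def project(Pw):
    """Project a world point P_w (3-vector) to computer image coordinates."""
    Pc = R @ Pw.reshape(3, 1) + T         # camera coordinates
    uvw = A @ Pc                          # homogeneous image coordinates
    s = uvw[2, 0]                         # scale factor s equals z_c
    return uvw[0, 0] / s, uvw[1, 0] / s

x0, y0 = project(np.array([10.0, 0.0, 0.0]))
```

With this frontal pose, a 10 mm offset along x_w at 500 mm depth maps 20 pixels away from the principal point, consistent with α = f/dx = 1000.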
With the coordinate systems defined above, the coordinate z_w of every point on the target is zero. Letting r_i be the i-th column of the rotation matrix R, the transformation between the world coordinate system and the computer image coordinate system can be simplified as follows:

s [x_0 y_0 1]^T = A [r_1 r_2 T] [x_w y_w 1]^T = H [x_w y_w 1]^T.  (2)

The 3×3 matrix H is called the homography matrix from the world coordinate system to the computer image coordinate system.
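The reduction to a homography for the planar target (z_w = 0) can be sketched as follows; the internal parameters and pose are illustrative placeholders, not calibrated values.

```python
import numpy as np

# Illustrative internal matrix and pose (not calibrated values).
A = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
T = np.array([0.0, 0.0, 500.0])

# H = A [r1 r2 T]: drop the third rotation column since z_w = 0.
H = A @ np.column_stack((R[:, 0], R[:, 1], T))

# Map a target point (x_w, y_w) to computer image coordinates.
pw = np.array([10.0, 20.0, 1.0])          # homogeneous target point
uvw = H @ pw
x0, y0 = uvw[0] / uvw[2], uvw[1] / uvw[2]
```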

Calibration of Structured Light Plane
To calibrate the parameters of the structured light plane, three problems must be solved: the solution of the homography matrix H, the correction of image distortion, and the extraction of the feature points' coordinates.

Solution of the homography matrix H
Since the homography matrix H has eight degrees of freedom, the matrix H can be solved up to a scale factor from eq. (2), as long as at least four corresponding points, no three of which are collinear, are available. Make the definition as follows: According to the solution of eq. (3) described in eq. (4), the coefficients of the matrix A can be solved from the matrix B. However, because of noise, the result of eq. (4) is often less than ideal in actual situations. Reference 7 gives an optimization method that takes the linear result of eq. (4) as the initial value and nonlinearly optimizes the coefficients of the matrix A with the Levenberg-Marquardt algorithm.
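A standard linear (DLT) estimate of H from point correspondences can be sketched as below; this illustrates only the linear step, whose result would then serve as the initial value for the Levenberg-Marquardt refinement of ref. 7. The synthetic ground-truth homography is invented for the check.

```python
import numpy as np

def estimate_homography(world_pts, image_pts):
    """DLT estimate of H from >= 4 correspondences, no 3 collinear.
    world_pts, image_pts: lists of (x, y) pairs."""
    rows = []
    for (xw, yw), (u, v) in zip(world_pts, image_pts):
        # Each correspondence contributes two linear equations in h.
        rows.append([xw, yw, 1, 0, 0, 0, -u * xw, -u * yw, -u])
        rows.append([0, 0, 0, xw, yw, 1, -v * xw, -v * yw, -v])
    B = np.array(rows, dtype=float)
    _, _, Vt = np.linalg.svd(B)
    H = Vt[-1].reshape(3, 3)          # null-space vector -> 3x3 matrix
    return H / H[2, 2]                # fix the arbitrary scale

# Synthetic check: points mapped by a known homography are recovered.
H_true = np.array([[1.2, 0.1, 5.0], [0.0, 0.9, -3.0], [1e-4, 0.0, 1.0]])
world = [(0, 0), (100, 0), (0, 100), (100, 100), (50, 25)]
image = []
for xw, yw in world:
    u, v, w = H_true @ np.array([xw, yw, 1.0])
    image.append((u / w, v / w))
H_est = estimate_homography(world, image)
```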
Once the internal parameter matrix A has been solved, the external parameter matrix [R T] can be decomposed from the matrix H. Letting h_i be the i-th column of the matrix H, the terms of the external parameter matrix are given by eq. (5):

r_1 = λA^(−1)h_1, r_2 = λA^(−1)h_2, r_3 = r_1 × r_2, T = λA^(−1)h_3, where λ = 1/‖A^(−1)h_1‖.  (5)

Owing to the effect of noise, the recovered rotation matrix generally does not satisfy the orthogonality constraint; however, it can be orthogonalized by singular value decomposition.
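The decomposition and the SVD orthogonalization step can be sketched as follows; the internal matrix and the ground-truth pose used for the check are illustrative.

```python
import numpy as np

def decompose_homography(H, A):
    """Recover (R, T) from H = A [r1 r2 T], orthogonalizing R by SVD."""
    Ainv = np.linalg.inv(A)
    lam = 1.0 / np.linalg.norm(Ainv @ H[:, 0])   # scale factor lambda
    r1 = lam * (Ainv @ H[:, 0])
    r2 = lam * (Ainv @ H[:, 1])
    r3 = np.cross(r1, r2)
    T = lam * (Ainv @ H[:, 2])
    R = np.column_stack((r1, r2, r3))
    # Project the (possibly noisy) R onto the nearest rotation matrix.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, T

# Synthetic check with a known pose (illustrative values).
A = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([10.0, -5.0, 500.0])
H = A @ np.column_stack((R_true[:, 0], R_true[:, 1], T_true))
R_est, T_est = decompose_homography(H, A)
```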

Image distortion model
The models above hold only when the ideal image point coincides with the actual image point. In reality, a difference exists between the two owing to the optical distortion of the camera lens. To improve the measurement accuracy, it is necessary to introduce a distortion model of the camera lens. There are three main kinds of distortion: radial distortion, decentering (bias) distortion, and thin prism distortion. (8) Radial distortion is mainly caused by defects of the lens. Decentering distortion is mainly caused by the inconsistency between the optical center and the geometric center of the optical system. Thin prism distortion is caused by defects in lens design and installation errors. Owing to the very high accuracy of modern optical system design, processing, and installation, thin prism distortion is very small for standard industrial cameras and can be ignored. Likewise, for measurements that do not require extremely high precision, the tangential component caused by decentering distortion can also be ignored. Thus, only radial distortion needs to be considered in general applications, and only its first two nonlinear coefficients need to be calculated; more complex distortion models not only fail to improve the measurement accuracy but may also produce numerical instability. (7) Therefore, the practical distortion correction model is as follows:

x_0′ = x_0 + (x_0 − u_0)[k_1(x_I² + y_I²) + k_2(x_I² + y_I²)²],
y_0′ = y_0 + (y_0 − v_0)[k_1(x_I² + y_I²) + k_2(x_I² + y_I²)²],

where (x_0, y_0) is the ideal image point coordinate in the computer image coordinate system and (x_0′, y_0′) is the image point coordinate obtained by actual observation. The term (x_I² + y_I²) is the square of the polar radius from the image point to the center, and k_1 and k_2 are the radial distortion coefficients. The detailed method of solving the radial distortion coefficients is discussed in ref. 7.
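The two-coefficient radial model can be sketched as below. Here the polar radius is computed in normalized ideal image coordinates (pixel offsets divided by α and β), which is one common convention; the numeric parameters are illustrative, not the calibrated values of Table 1.

```python
import numpy as np

def distort(x0, y0, u0, v0, alpha, beta, k1, k2):
    """Map an ideal image point to its radially distorted observation."""
    # Ideal image coordinates, normalized relative to the principal point.
    xi = (x0 - u0) / alpha
    yi = (y0 - v0) / beta
    r2 = xi * xi + yi * yi                 # squared polar radius
    factor = k1 * r2 + k2 * r2 * r2        # first two radial coefficients
    return x0 + (x0 - u0) * factor, y0 + (y0 - v0) * factor

# Illustrative values: slight barrel distortion near the image corner.
xd, yd = distort(420.0, 340.0, 320.0, 240.0, 1000.0, 1000.0, -0.1, 0.4)
```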

Extraction of the feature points' coordinates
By image processing, we can obtain the coordinate of a point on the intersection line L in the computer image coordinate system. From it, we can obtain the coordinate P_w(x_w, y_w, 0) of the point in the world coordinate system, with x_w and y_w obtained by eq. (8) and the constraint z_w = 0.
For a single target image, almost all the points on the intersection line L can be retained as feature points, so hundreds to thousands of feature points can be obtained from a single image, which greatly improves the calibration efficiency. To obtain the equation of the structured light plane, noncollinear feature points are required, so the target must be placed at two or more different positions. For the feature points obtained at each position, we can always unify their coordinates from the world coordinate system to the camera coordinate system by eq. (9). The least squares method is then used to fit the equation of the structured light plane.
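The plane-fitting step can be sketched as a total least squares fit via SVD of the centered points, one standard way to realize the least squares fit described above; the synthetic plane z = 2x + 3y + 4 is invented for the check.

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n . p + d = 0 to (N, 3) camera-frame points
    by total least squares (SVD of the centered point cloud)."""
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    normal = Vt[-1]                        # direction of least variance
    d = -normal @ centroid
    return normal, d

# Synthetic check: noise-free points on the plane z = 2x + 3y + 4.
rng = np.random.default_rng(0)
xy = rng.uniform(-50, 50, size=(200, 2))
pts = np.column_stack((xy, 2 * xy[:, 0] + 3 * xy[:, 1] + 4))
normal, d = fit_plane(pts)
# Rescale so the z-component is -1, for comparison with 2x + 3y - z + 4 = 0.
normal_s = normal / -normal[2]
d_s = d / -normal[2]
```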

Calibration for rotating light plane
Usually, the structured light plane must be scanned by a motor stage to measure the object's 3D geometry, as shown in Fig. 2. The parameters of the light plane change at each position. Therefore, if we use the method described above, we must calibrate the equation of the corresponding structured light plane for every rotary position of the motor stage, and we must also ensure that the repeatability of the stage's rotation angle is good.
An alternative is to calibrate the rotation axis of the motor stage in addition to applying the existing calibration method. The equations of several structured light planes can be obtained by calibration; the intersection line of these planes is the rotation axis of the stage and is easily obtained by fitting. With this method, we only need to accurately measure the offset between the current and initial angles when the structured light plane rotates; the equation of the structured light plane then follows from the precalibrated rotation axis.
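Once the axis is known, the plane at any angle follows by rotating the initial plane about it; a sketch using Rodrigues' rotation formula is shown below. It relies on the fact that the axis lies in every rotated light plane, so a point p0 on the axis stays on the plane; all numeric values are illustrative.

```python
import numpy as np

def rotate_plane(normal, d, p0, u, theta):
    """Rotate the plane normal . p + d = 0 (which contains the axis
    point p0 with unit direction u) by angle theta about that axis."""
    u = u / np.linalg.norm(u)
    K = np.array([[0, -u[2], u[1]],
                  [u[2], 0, -u[0]],
                  [-u[1], u[0], 0]])      # cross-product matrix of u
    # Rodrigues' formula for the rotation matrix.
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    n_new = R @ normal
    # p0 lies on the axis, hence on every rotated plane.
    d_new = -n_new @ p0
    return n_new, d_new

# Axis along z through the origin; initial plane x = 0 contains the axis.
n, d = rotate_plane(np.array([1.0, 0.0, 0.0]), 0.0,
                    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                    np.pi / 2)
```

Rotating the plane x = 0 by 90° about the z-axis yields the plane y = 0, as expected.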

Calibration of structured light plane
The experimental setup is shown in Fig. 2. It uses an STC-CLC152A complementary metal oxide semiconductor (CMOS) camera from Sensor Technologies, a 5-100 mm zoom lens from Guilin Optical Instrument Factory, and a 650 nm, 10 W line-structured laser. A 16 × 12 grid board-like planar target is used in the system, as shown in Fig. 3; each grid cell is 10 × 10 mm². During calibration, the target can be freely moved to different positions in the visible space of the camera, provided the target and the structured light plane intersect with each other, as shown in Fig. 4.
To facilitate subsequent image processing, the camera shoots images of the target with the structured light source off and on, as shown in Figs. 3 and 4, respectively. After Fig. 3 is subtracted from Fig. 4, we obtain the image of the light line to be processed, as shown in Fig. 5. Using the line extraction algorithm described in ref. 9, we can obtain the sub-pixel coordinates of the center of the line structured light. With the focal length of the lens locked, the internal parameters and radial distortion coefficients of the camera are calibrated by the method described in ref. 7. The calibration results are shown in Table 1.
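As a much simpler stand-in for the sub-pixel line extractor of ref. 9, the sketch below subtracts the light-off image from the light-on image and takes the intensity-weighted centroid of each column as the line center; the threshold value and the synthetic Gaussian-profile line are illustrative.

```python
import numpy as np

def line_centers(img_on, img_off, threshold=20.0):
    """Return (column, sub-pixel row) centers of the laser line,
    using per-column intensity centroids of the difference image."""
    diff = img_on.astype(float) - img_off.astype(float)
    diff[diff < threshold] = 0.0           # suppress background noise
    centers = []
    for col in range(diff.shape[1]):
        column = diff[:, col]
        total = column.sum()
        if total > 0:                      # skip columns without the line
            rows = np.arange(diff.shape[0])
            centers.append((col, (rows * column).sum() / total))
    return centers

# Synthetic check: a horizontal line with a Gaussian profile at row 12.3.
rows = np.arange(40, dtype=float)[:, None]
img_off = np.zeros((40, 8))
img_on = 255.0 * np.exp(-((rows - 12.3) ** 2) / 2.0) * np.ones((1, 8))
centers = line_centers(img_on, img_off)
```

The centroid recovers the sub-pixel row to within about 0.05 pixel here; the thresholding introduces a small truncation bias that a true Steger-style extractor avoids.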
By moving the target to different positions, we obtained 1530 feature points on three different lines in the camera coordinate system. The equation of the structured light plane, obtained by fitting these feature points, is

32x − 2y − 18z + 10000 = 0.  (10)

Figure 6 shows the relationship between the feature points and the fitted plane, where the dark points are the measured feature points and the plane made up of gray points is the light plane.
To verify the accuracy of the calibration results, the equation of the target plane and that of the fitted light plane are solved simultaneously to obtain the equation of the intersection line of the target and the light plane. Comparing the measured point coordinates with the coordinates of the corresponding points on this intersection line gives a root mean square error of 0.025 mm.
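This verification can be sketched as follows: intersect the two planes to obtain the line L, then compute the RMS distance of the measured points to L. The planes and points below are illustrative, not the experimental data.

```python
import numpy as np

def plane_line(n1, d1, n2, d2):
    """Intersection line of planes n . p + d = 0 -> (point, unit direction)."""
    direction = np.cross(n1, n2)
    # One point on both planes; the third row fixes the free parameter.
    M = np.vstack((n1, n2, direction))
    p0 = np.linalg.solve(M, np.array([-d1, -d2, 0.0]))
    return p0, direction / np.linalg.norm(direction)

def rms_to_line(points, p0, u):
    """RMS perpendicular distance of (N, 3) points to the line (p0, u)."""
    diff = points - p0
    perp = diff - np.outer(diff @ u, u)    # component orthogonal to the line
    return np.sqrt((perp ** 2).sum(axis=1).mean())

# Illustrative check: planes z = 0 and y = 0 intersect in the x-axis.
p0, u = plane_line(np.array([0.0, 0.0, 1.0]), 0.0,
                   np.array([0.0, 1.0, 0.0]), 0.0)
rms = rms_to_line(np.array([[1.0, 0.3, 0.4], [5.0, -0.3, -0.4]]), p0, u)
```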

Calibration of rotation axis
As mentioned above, to calibrate the rotation axis of the light plane when the motor stage rotates, we need the equations of the structured light plane at at least two different positions. Using the calibration method described above, we obtained the equations of three structured light planes (eqs. (11)-(13)), among which are

0.801467x + 0.135193y − z − 326.112512 = 0,
0.510527x + 0.135689y − z − 36.414167 = 0.

Fig. 3 (left). Target when the structured light source is off. Fig. 4 (middle). Target when the structured light source is on. Fig. 5 (right). Image of the light line after Fig. 3 is subtracted from Fig. 4.

Table 1. Calibration results of the internal parameters and radial distortion coefficients.
Parameter    Value
α            1359.9 pixels
β            1357.4 pixels
c            0
u_0          533.9 pixels
v_0          394.4 pixels
k_1          −0.09732
k_2          0.41023

Equation (14) stands for the corresponding rotation axis; the fitting residue is 0.012 pixels. Figure 7 shows the positional relationship between the structured light planes and the rotation axis. The intersection line of the three light planes is taken as the rotation axis. As seen in Fig. 7, the three light planes intersect almost exactly in a single line, which also indicates the high calibration precision of these structured light planes.

Conclusions
In this paper, a novel method of structured light plane calibration has been presented, which uses only a simple 2D planar target. The target can be freely moved to different positions in the visible space of the camera, provided the target and the structured light plane intersect with each other. The residue of the calibrated light plane is only 0.025 mm at an observing distance of approximately 500 mm. The calibration results show that the method is simple, universal, and highly efficient.