S&M2679 Research Paper of Special Issue
Published: September 16, 2021
Robust 3D Model Reconstruction Based on Continuous Point Cloud for Autonomous Vehicles
Hongwei Gao, Jiahui Yu, Jian Sun, Wei Yang, Yueqiu Jiang, Lei Zhu, and Zhaojie Ju
(Received December 22, 2020; Accepted August 26, 2021)
Keywords: dense 3D point cloud, region growing, match optimization, monocular zoom stereo vision
Continuous point cloud stitching can reconstruct a 3D model and plays an essential role in autonomous vehicles. However, most existing methods are based on binocular stereo vision, which increases space and material costs, and such systems also suffer from poor matching accuracy and speed. In this paper, a novel point cloud stitching method based on a monocular vision system is proposed to solve these problems. First, calibration and parameter acquisition based on monocular vision are presented. Next, the region-growing algorithm used in sparse and dense matching is redesigned to improve the matching density. Finally, an iterative closest point (ICP)-based splicing method is proposed for monocular zoom stereo vision. The point cloud data are spliced by introducing the rotation matrix and translation vector obtained in the matching process. In the experiments, the proposed method is evaluated on two datasets: a self-collected dataset and a public dataset. The results show that the proposed method achieves higher matching accuracy than binocular-based systems and also outperforms other recent approaches. In addition, the 3D model generated using this method has a wider viewing angle, a more precise outline, and more distinct layers than those produced by state-of-the-art algorithms.
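The splicing step described above applies a rigid transform (rotation matrix and translation vector obtained during matching) to map one point cloud into the frame of another before merging. The following is a minimal sketch of that idea in NumPy, not the authors' implementation; the function name `splice_point_clouds` and the example values of `R` and `t` are illustrative assumptions.

```python
import numpy as np

def splice_point_clouds(source, target, R, t):
    """Map the source cloud into the target frame with a rigid
    transform (rotation R, translation t) and merge the two clouds.
    source: (N, 3), target: (M, 3), R: (3, 3), t: (3,)."""
    aligned = source @ R.T + t           # rigid transform of each source point
    return np.vstack([target, aligned])  # merged cloud in the target frame

# Example: a 90-degree rotation about the z-axis plus a unit shift
# along x maps the single source point onto the target point.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 0.0, 0.0])
source = np.array([[0.0, 1.0, 0.0]])
target = np.array([[0.0, 0.0, 0.0]])
merged = splice_point_clouds(source, target, R, t)
```

In a full ICP loop, `R` and `t` would be re-estimated from nearest-neighbor correspondences at each iteration; here they are taken as given, matching the paper's description of reusing the transform obtained in the matching process.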
Corresponding authors: Jiahui Yu and Zhaojie Ju
This work is licensed under a Creative Commons Attribution 4.0 International License.
Cite this article
Hongwei Gao, Jiahui Yu, Jian Sun, Wei Yang, Yueqiu Jiang, Lei Zhu, and Zhaojie Ju, Robust 3D Model Reconstruction Based on Continuous Point Cloud for Autonomous Vehicles, Sens. Mater., Vol. 33, No. 9, 2021, pp. 3169–3186.