• Laser & Optoelectronics Progress
  • Vol. 62, Issue 8, 0828002 (2025)
Tingya Liang1,2,3,*, Xie Han1,2,3, and Haoyang Li1,2,3
Author Affiliations
  • 1School of Computer Science and Technology, North University of China, Taiyuan 030051, Shanxi, China
  • 2Shanxi Provincial Key Laboratory of Machine Vision and Virtual Reality, Taiyuan 030051, Shanxi, China
  • 3Shanxi Province's Vision Information Processing and Intelligent Robot Engineering Research Center, Taiyuan 030051, Shanxi, China
    DOI: 10.3788/LOP241885
    Tingya Liang, Xie Han, Haoyang Li. Multi-View Three-Dimensional Reconstruction of Weak Texture Regions Based on Simple Lidar[J]. Laser & Optoelectronics Progress, 2025, 62(8): 0828002

    Abstract

    To address the holes and missing regions that arise in weakly textured areas during multi-view stereo (MVS) three-dimensional (3D) reconstruction, this paper proposes a complementary 3D reconstruction method that fuses MVS and lidar data. First, the GeoMVSNet deep learning network processes multi-view images captured by a smartphone camera to produce MVS depth maps. Next, the smartphone camera and lidar are calibrated to obtain the intrinsic and extrinsic parameters and the coordinate system transformation matrix. Through temporal and spatial alignment, viewpoint transformation, and scale transformation, the sparse point cloud collected by the lidar is projected into the image view. In addition, a depth map enhancement algorithm is proposed and applied to generate dense depth data from the sparse lidar point cloud. Finally, the dense depth maps generated from the lidar data are fused with the MVS depth maps. Experimental results on a self-built dataset show that the proposed method significantly improves reconstruction quality in weakly textured areas, enhancing the accuracy and completeness of 3D reconstruction. This study thus provides an effective solution for 3D reconstruction of weakly textured regions.
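
    The abstract describes projecting the calibrated lidar point cloud into the camera view and then fusing the resulting depth data with the MVS depth maps. The sketch below illustrates those two steps in a minimal form; it is not the authors' implementation, and the names (K, T_cam_lidar, fuse_depth_maps) as well as the simple hole-filling fusion rule are illustrative assumptions.

    ```python
    # Minimal sketch: project lidar points into the camera image to form a sparse
    # depth map, then fill holes in an MVS depth map with lidar depth values.
    # All function/variable names and the fusion rule are illustrative assumptions.
    import numpy as np

    def project_lidar_to_depth(points_lidar: np.ndarray,
                               K: np.ndarray,
                               T_cam_lidar: np.ndarray,
                               image_size: tuple) -> np.ndarray:
        """Project Nx3 lidar points into the camera image as a sparse depth map.

        K           : 3x3 camera intrinsic matrix (from calibration).
        T_cam_lidar : 4x4 lidar-to-camera extrinsic transform (from calibration).
        image_size  : (height, width) of the camera image.
        """
        h, w = image_size
        depth = np.zeros((h, w), dtype=np.float32)  # 0 marks pixels with no lidar return

        # Homogeneous transform from the lidar frame into the camera frame.
        pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
        pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

        # Keep only points in front of the camera.
        pts_cam = pts_cam[pts_cam[:, 2] > 0]

        # Perspective projection with the intrinsic matrix.
        uv = (K @ pts_cam.T).T
        u = np.round(uv[:, 0] / uv[:, 2]).astype(int)
        v = np.round(uv[:, 1] / uv[:, 2]).astype(int)
        z = pts_cam[:, 2]

        # Discard projections outside the image; keep the nearest depth per pixel.
        valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        for ui, vi, zi in zip(u[valid], v[valid], z[valid]):
            if depth[vi, ui] == 0 or zi < depth[vi, ui]:
                depth[vi, ui] = zi
        return depth

    def fuse_depth_maps(mvs_depth: np.ndarray, lidar_depth: np.ndarray) -> np.ndarray:
        """Fill holes (zero-depth pixels) in the MVS depth map with lidar depth."""
        fused = mvs_depth.copy()
        holes = (fused == 0) & (lidar_depth > 0)
        fused[holes] = lidar_depth[holes]
        return fused
    ```

    In this sketch the lidar depth only fills pixels where MVS produced no estimate (typically the weakly textured areas); the paper's actual fusion and its depth map enhancement (densification) step are more involved than this assumed rule.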