• Chinese Journal of Lasers
  • Vol. 52, Issue 6, 0600003 (2025)
Kai Huang1, Junqiao Zhao2,*, and Tiantian Feng1
Author Affiliations
  • 1College of Surveying and Geo-Informatics, Tongji University, Shanghai 200092, China
  • 2School of Computer Science and Technology, Tongji University, Shanghai 200092, China
    DOI: 10.3788/CJL241023
    Kai Huang, Junqiao Zhao, Tiantian Feng. Local Geometric Information Representation and Uncertainty Analysis in LiDAR SLAM[J]. Chinese Journal of Lasers, 2025, 52(6): 0600003
    References

    [1] Li D R, Li M. Research advance and application prospect of unmanned aerial vehicle remote sensing system[J]. Geomatics and Information Science of Wuhan University, 39, 505-513, 540(2014).

    [2] Geiger A, Lenz P, Urtasun R. Are we ready for autonomous driving? The KITTI vision benchmark suite[C], 3354-3361(2012).

    [3] Geiger A, Lenz P, Stiller C et al. Vision meets robotics: the KITTI dataset[J]. The International Journal of Robotics Research, 32, 1231-1237(2013).

    [4] Zhang Y J. Geometric processing of low altitude remote sensing images captured by unmanned airship[J]. Geomatics and Information Science of Wuhan University, 34, 284-288(2009).

    [5] Zhu N, Marais J, Bétaille D et al. GNSS position integrity in urban environments: a review of literature[J]. IEEE Transactions on Intelligent Transportation Systems, 19, 2762-2778(2018).

    [6] Cadena C, Carlone L, Carrillo H et al. Past, present, and future of simultaneous localization and mapping: toward the robust-perception age[J]. IEEE Transactions on Robotics, 32, 1309-1332(2016).

    [7] Macario Barros A, Michel M, Moline Y et al. A comprehensive survey of visual SLAM algorithms[J]. Robotics, 11, 24(2022).

    [8] Bujanca M, Shi X S, Spear M et al. Robust SLAM systems: are we there yet?[C]. Czech Republic, 5320-5327(2021).

    [9] Bloesch M, Burri M, Omari S et al. Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback[J]. The International Journal of Robotics Research, 36, 1053-1072(2017).

    [10] Forster C, Carlone L, Dellaert F et al. IMU preintegration on manifold for efficient visual-inertial maximum-a-posteriori estimation[C], 13-17(2015).

    [11] Forster C, Carlone L, Dellaert F et al. On-manifold preintegration for real-time visual: inertial odometry[J]. IEEE Transactions on Robotics, 33, 1-21(2017).

    [12] Younes G, Asmar D, Shammas E. A survey on non-filter-based monocular visual SLAM systems[EB/OL]. https://arxiv.org/abs/1607.00470v1

    [13] Gao X, Zhang T, Liu Y et al[M]. Fourteen lectures on visual SLAM: from theory to practice(2017).

    [14] Taketomi T, Uchiyama H, Ikeda S. Visual SLAM algorithms: a survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 9, 16(2017).

    [15] Mur-Artal R, Montiel J M M, Tardós J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 31, 1147-1163(2015).

    [16] Mur-Artal R, Tardós J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 33, 1255-1262(2017).

    [17] Campos C, Elvira R, Rodríguez J J G et al. ORB-SLAM3: an accurate open-source library for visual, visual–inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 37, 1874-1890(2021).

    [18] Qin T, Li P L, Shen S J. VINS-mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 34, 1004-1020(2018).

    [19] Geneva P, Eckenhoff K, Lee W et al. OpenVINS: a research platform for visual-inertial estimation[C], 4666-4672(2020).

    [20] Solà J. Quaternion kinematics for the error-state Kalman filter[EB/OL]. https://arxiv.org/abs/1711.02508v1

    [21] Rodríguez-Arévalo M L, Neira J, Castellanos J A. On the importance of uncertainty representation in active SLAM[J]. IEEE Transactions on Robotics, 34, 829-834(2018).

    [22] Eade E. Lie groups for 2D and 3D transformations[EB/OL]. https://ethaneade.org/lie.pdf

    [23] Rusu R B, Cousins S. 3D is here: point cloud library (PCL)[C], 9-13(2011).

    [24] Ye H Y, Chen Y Y, Liu M. Tightly coupled 3D lidar inertial odometry and mapping[C], 3144-3150(2019).

    [25] Shan T X, Englot B, Meyers D et al. LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping[C], 5135-5142(2021).

    [26] Qin C, Ye H Y, Pranata C E et al. LINS: a lidar-inertial state estimator for robust and efficient navigation[C], 8899-8906(2020).

    [27] Chen K, Lopez B T, Agha-mohammadi A A et al. Direct LiDAR odometry: fast localization with dense point clouds[J]. IEEE Robotics and Automation Letters, 7, 2000-2007(2022).

    [28] Setterfield T P, Hewitt R A, Espinoza A T et al. Feature-based scanning LiDAR-inertial odometry using factor graph optimization[J]. IEEE Robotics and Automation Letters, 8, 3374-3381(2023).

    [29] Besl P J, McKay N D. A method for registration of 3-D shapes[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 239-256(1992).

    [30] Rusinkiewicz S, Levoy M. Efficient variants of the ICP algorithm[C], 145-152(2001).

    [31] Pomerleau F, Colas F, Siegwart R et al. Comparing ICP variants on real-world data sets[J]. Autonomous Robots, 34, 133-148(2013).

    [32] Biber P, Strasser W. The normal distributions transform: a new approach to laser scan matching[C], 2743-2748(2003).

    [33] Zhou B, Tang Z Q, Qian K et al. A LiDAR odometry for outdoor mobile robots using NDT based scan matching in GPS-denied environments[C], 1230-1235(2017).

    [34] Schulz C, Zell A. Real-time graph-based SLAM with occupancy normal distributions transforms[C], 3106-3111(2020).

    [35] Srinara S, Lee C M, Tsai S et al. Performance analysis of 3D NDT scan matching for autonomous vehicles using INS/GNSS/3D LiDAR-SLAM integration scheme[C], 22-25(2021).

    [36] Segal A V, Haehnel D, Thrun S. Generalized-ICP[M]. Robotics, 161-168(2010).

    [37] Serafin J, Grisetti G. NICP: dense normal based point cloud registration[C], 742-749(2015).

    [38] Yokozuka M, Koide K, Oishi S et al. LiTAMIN: LiDAR-based tracking and mapping by stabilized ICP for geometry approximation with normal distributions[C], 5143-5150(2021).

    [39] Yokozuka M, Koide K, Oishi S et al. LiTAMIN2: ultra light LiDAR-based SLAM using geometric approximation applied with KL-divergence[C], 11619-11625(2021).

    [40] Koide K, Yokozuka M, Oishi S et al. Voxelized GICP for fast and accurate 3D point cloud registration[C], 11054-11059(2021).

    [41] Anderson S, Barfoot T D. RANSAC for motion-distorted 3D visual sensors[C], 2093-2099(2013).

    [42] Tong C H, Barfoot T D. Gaussian process Gauss‒Newton for 3D laser-based visual odometry[C], 5204-5211(2013).

    [43] Anderson S, Barfoot T D. Towards relative continuous-time SLAM[C], 1033-1040(2013).

    [44] Zhang J, Singh S. LOAM: lidar odometry and mapping in real-time[C], 12-16(2014).

    [45] Zhang J, Singh S. Low-drift and real-time lidar odometry and mapping[J]. Autonomous Robots, 41, 401-416(2017).

    [46] Zhou L P, Koppel D, Kaess M. LiDAR SLAM with plane adjustment for indoor environment[J]. IEEE Robotics and Automation Letters, 6, 7073-7080(2021).

    [47] Zhou L P, Wang S Z, Kaess M. π-LSAM: LiDAR smoothing and mapping with planes[C], 5751-5757(2021).

    [48] Martínez-Otzeta J M, Rodríguez-Moreno I, Mendialdua I et al. RANSAC for robotic applications: a survey[J]. Sensors, 23, 327(2022).

    [49] Raguram R, Frahm J M, Pollefeys M. A comparative analysis of RANSAC techniques leading to adaptive real-time random sample consensus[M]. Computer vision‒ECCV 2008, 5303, 500-513(2008).

    [50] Chum O, Matas J. Optimal randomized RANSAC[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30, 1472-1482(2008).

    [51] Zhou L P, Huang G Q, Mao Y N et al. PLC-LiSLAM: LiDAR SLAM with planes, lines, and cylinders[J]. IEEE Robotics and Automation Letters, 7, 7163-7170(2022).

    [52] Wei X, Lv J X, Sun J et al. GCLO: ground constrained LiDAR odometry with low-drifts for GPS-denied indoor environments[C], 2229-2235(2022).

    [53] Zhou P W, Guo X X, Pei X F et al. T-LOAM: truncated least squares LiDAR-only odometry and mapping in real time[J]. IEEE Transactions on Geoscience and Remote Sensing, 60, 5701013(2021).

    [54] Pan Y, Xiao P C, He Y J et al. MULLS: versatile LiDAR SLAM via multi-metric linear least square[C], 11633-11640(2021).

    [55] Xu W, Cai Y X, He D J et al. FAST-LIO2: fast direct LiDAR-inertial odometry[J]. IEEE Transactions on Robotics, 38, 2053-2073(2022).

    [56] Yuan C J, Xu W, Liu X Y et al. Efficient and probabilistic adaptive voxel mapping for accurate online LiDAR odometry[J]. IEEE Robotics and Automation Letters, 7, 8518-8525(2022).

    [57] Bai C G, Xiao T, Chen Y J et al. Faster-LIO: lightweight tightly coupled lidar-inertial odometry using parallel sparse incremental voxels[J]. IEEE Robotics and Automation Letters, 7, 4861-4868(2022).

    [58] Vizzo I, Guadagnino T, Mersch B et al. KISS-ICP: in defense of point-to-point ICP‒simple, accurate, and robust registration if done the right way[J]. IEEE Robotics and Automation Letters, 8, 1029-1036(2023).

    [59] Huang K, Zhao J Q, Zhu Z Y et al. LOG-LIO: a LiDAR-inertial odometry with efficient local geometric information estimation[J]. IEEE Robotics and Automation Letters, 9, 459-466(2024).

    [60] Huang K, Zhao J Q, Lin J Y et al. LOG-LIO2: a LiDAR-inertial odometry with efficient uncertainty analysis[EB/OL]. https://arxiv.org/abs/2405.01316v2

    [61] Wang Q S, Zhang J, Liu Y S et al. High-precision and fast LiDAR odometry and mapping algorithm[J]. Journal of Advanced Computational Intelligence and Intelligent Informatics, 26, 206-216(2022).

    [62] Chen S B, Ma H, Jiang C H et al. NDT-LOAM: a real-time lidar odometry and mapping with weighted NDT and LFA[J]. IEEE Sensors Journal, 22, 3660-3671(2022).

    [63] Chen S W, Nardari G V, Lee E S et al. SLOAM: semantic lidar odometry and mapping for forest inventory[J]. IEEE Robotics and Automation Letters, 5, 612-619(2020).

    [64] Li L, Kong X, Zhao X R et al. SA-LOAM: semantic-aided LiDAR SLAM with loop closure[C], 7627-7634(2021).

    [65] Wang H, Wang C, Xie L H. Intensity-SLAM: intensity assisted localization and mapping for large scale environment[J]. IEEE Robotics and Automation Letters, 6, 1715-1721(2021).

    [66] Shan T X, Englot B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain[C], 4758-4765(2018).

    [67] Wang H, Wang C, Chen C L et al. F-LOAM: fast LiDAR odometry and mapping[C]. Czech Republic, 4390-4396(2021).

    [68] Dellenbach P, Deschaud J E, Jacquet B et al. CT-ICP: real-time elastic LiDAR odometry with loop closure[C], 5580-5586(2022).

    [69] Furgale P, Barfoot T D, Sibley G. Continuous-time batch estimation using temporal basis functions[C], 2088-2095(2012).

    [70] Zheng X, Zhu J K. Traj-LO: in defense of LiDAR-only odometry using an effective continuous-time trajectory[J]. IEEE Robotics and Automation Letters, 9, 1961-1968(2024).

    [71] Kaess M, Ranganathan A, Dellaert F. iSAM: incremental smoothing and mapping[J]. IEEE Transactions on Robotics, 24, 1365-1378(2008).

    [72] Kaess M, Johannsson H, Roberts R et al. iSAM2: incremental smoothing and mapping using the Bayes tree[J]. The International Journal of Robotics Research, 31, 216-235(2012).

    [73] Lin J R, Zhang F. LOAM livox: a fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV[C], 3126-3131(2020).

    [74] Jiao J H, Ye H Y, Zhu Y L et al. Robust odometry and mapping for multi-LiDAR systems with online extrinsic calibration[J]. IEEE Transactions on Robotics, 38, 351-371(2022).

    [75] Liu Z, Zhang F. BALM: bundle adjustment for lidar mapping[J]. IEEE Robotics and Automation Letters, 6, 3184-3191(2021).

    [76] Liu Z, Liu X Y, Zhang F. Efficient and consistent bundle adjustment on lidar point clouds[J]. IEEE Transactions on Robotics, 39, 4366-4386(2023).

    [77] Solà J, Deray J, Atchuthan D. A micro Lie theory for state estimation in robotics[EB/OL]. https://arxiv.org/abs/1812.01537v9

    [78] Barfoot T D[M]. State estimation for robotics(2017).

    [79] Xu W, Zhang F. FAST-LIO: a fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter[J]. IEEE Robotics and Automation Letters, 6, 3317-3324(2021).

    [80] Gao X, Xiao T, Bai C G et al. Anderson acceleration for on-manifold iterated error state Kalman filters[J]. IEEE Robotics and Automation Letters, 7, 12243-12250(2022).

    [81] He D J, Xu W, Chen N et al. Point-LIO: robust high-bandwidth light detection and ranging inertial odometry[J]. Advanced Intelligent Systems, 5, 2200459(2023).

    [82] Huang K, Zhao J Q, Lin J Y et al. LOG-LIO2: a LiDAR-inertial odometry with efficient uncertainty analysis[J]. IEEE Robotics and Automation Letters, 9, 8226-8233(2024).

    [83] Jung M, Jung S, Kim A. Asynchronous multiple LiDAR-inertial odometry using point-wise inter-LiDAR uncertainty propagation[J]. IEEE Robotics and Automation Letters, 8, 4211-4218(2023).

    [84] Geneva P, Eckenhoff K, Yang Y L et al. LIPS: LiDAR-inertial 3D plane SLAM[C], 123-130(2018).

    [85] Quenzel J, Behnke S. Real-time multi-adaptive-resolution-surfel 6D LiDAR odometry using continuous-time trajectory optimization[C]. Czech Republic, 5499-5506(2021).

    [86] Lü J J, Hu K W, Xu J H et al. CLINS: continuous-time trajectory estimation for LiDAR-inertial system[C]. Czech Republic, 6657-6663(2021).

    [87] Ramezani M, Khosoussi K, Catt G et al. Wildcat: online continuous-time 3D lidar-inertial SLAM[EB/OL]. https://arxiv.org/abs/2205.12595v1

    [88] Nguyen T M, Duberg D, Jensfelt P et al. SLICT: multi-input multi-scale surfel-based lidar-inertial continuous-time odometry and mapping[J]. IEEE Robotics and Automation Letters, 8, 2102-2109(2023).

    [89] Chen K, Nemiroff R, Lopez B T. Direct LiDAR-inertial odometry: lightweight LIO with continuous-time motion correction[C]. United Kingdom, 3983-3989(2023).

    [90] Zheng X, Zhu J K. Traj-LIO: a resilient multi-LiDAR multi-IMU state estimator through sparse Gaussian process[EB/OL]. https://arxiv.org/abs/2402.09189v1

    [91] Chen X, Vizzo I, Läbe T et al. Range image-based LiDAR localization for autonomous vehicles[C], 5802-5808(2021).

    [92] Dong H, Chen X, Stachniss C. Online range image-based pole extractor for long-term LiDAR localization in urban environments[C](2021).

    [93] Wang J[M]. Geometric structure of high-dimensional data and dimensionality reduction(2011).

    [94] Palieri M, Morrell B, Thakur A et al. LOCUS: a multi-sensor lidar-centric solution for high-precision odometry and 3D mapping in real-time[J]. IEEE Robotics and Automation Letters, 6, 421-428(2021).

    [95] Reinke A, Palieri M, Morrell B et al. LOCUS 2.0: robust and computationally efficient lidar odometry for real-time 3D mapping[J]. IEEE Robotics and Automation Letters, 7, 9043-9050(2022).

    [96] Li K L, Li M, Hanebeck U D. Towards high-performance solid-state-LiDAR-inertial odometry and mapping[J]. IEEE Robotics and Automation Letters, 6, 5167-5174(2021).

    [97] Badino H, Huber D, Park Y et al. Fast and accurate computation of surface normals from range images[C], 3084-3091(2011).

    [98] Bogoslavskyi I, Zampogiannis K, Phan R. Fast and robust normal estimation for sparse LiDAR scans[EB/OL]. https://arxiv.org/abs/2404.14281v1

    [99] Duberg D, Jensfelt P. UFOMap: an efficient probabilistic 3D mapping framework that embraces the unknown[J]. IEEE Robotics and Automation Letters, 5, 6411-6418(2020).

    [100] Ji X Y, Yuan S H, Yin P Y et al. LIO-GVM: an accurate, tightly-coupled lidar-inertial odometry with Gaussian voxel map[J]. IEEE Robotics and Automation Letters, 9, 2200-2207(2024).

    [101] Wu C, You Y, Yuan Y F et al. VoxelMap: mergeable voxel mapping method for online LiDAR (-inertial) odometry[J]. IEEE Robotics and Automation Letters, 9, 427-434(2024).

    [102] Thrun S, Burgard W, Fox D[M]. Probabilistic robotics(2005).

    [103] Barfoot T D, Furgale P T. Associating uncertainty with three-dimensional poses for use in estimation problems[J]. IEEE Transactions on Robotics, 30, 679-693(2014).

    [104] Smith R C, Cheeseman P. On the representation and estimation of spatial uncertainty[J]. The International Journal of Robotics Research, 5, 56-68(1986).

    [105] Ceolato R, Berg M J. Aerosol light extinction and backscattering: a review with a lidar perspective[J]. Journal of Quantitative Spectroscopy and Radiative Transfer, 262, 107492(2021).

    [106] Yuan C J, Liu X Y, Hong X P et al. Pixel-level extrinsic self calibration of high resolution LiDAR and camera in targetless environments[J]. IEEE Robotics and Automation Letters, 6, 7517-7524(2021).

    [107] Zhu Z M, Qu X H, Liang H Y et al. Uniform illumination study by light-emitting diode ring array and diffuse reflection surface[J]. Acta Optica Sinica, 31, 0115001(2011).

    [108] Fu J D, Chen Q, Qiu Y X et al. Indoor light environment simulation based on light energy transfer method[J]. Laser & Optoelectronics Progress, 56, 111502(2019).

    [109] Yang Y F, Zhang A, Guo Y C et al. Influence of target surface BRDF on non-line-of-sight imaging[J]. Laser & Optoelectronics Progress, 61, 1811003(2024).

    [110] Pfeifer N, Dorninger P, Haring A. Investigating terrestrial laser scanning intensity data: quality and functional relations[EB/OL]. https://publik.tuwien.ac.at/files/pub-geo_1932.pdf

    [111] Tan K, Cheng X J. TLS laser intensity correction based on polynomial model[J]. Chinese Journal of Lasers, 42, 0314002(2015).

    [112] Cheng X L, Cheng X J, Li Q et al. Laser intensity correction of terrestrial 3D laser scanning based on sectional polynomial model[J]. Laser & Optoelectronics Progress, 54, 112802(2017).

    [113] Li Q, Cheng X J, Tian R et al. Correction and normalization of multi-scan terrestrial three-dimensional laser scanning intensity[J]. Laser & Optoelectronics Progress, 54, 122802(2017).

    [114] Tasdizen T, Whitaker R. Cramer-Rao bounds for nonparametric surface reconstruction from range data[C], 70-77(2003).

    [115] Bae K H, Belton D, Lichti D D. A closed-form expression of the positional uncertainty for 3D point clouds[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31, 577-590(2009).

    [116] Yin J, Li A, Li T et al. M2DGR: a multi-sensor and multi-scenario SLAM dataset for ground robots[J]. IEEE Robotics and Automation Letters, 7, 2266-2273(2022).

    [117] Jiang B Q, Shen S J. A LiDAR-inertial odometry with principled uncertainty modeling[C], 13292-13299(2022).

    [118] Wang H, Wang C, Xie L H. Intensity scan context: coding intensity and geometry relations for loop closure detection[C], 2095-2101(2020).

    [119] Guo J D, Borges P V K, Park C et al. Local descriptor for robust place recognition using LiDAR intensity[J]. IEEE Robotics and Automation Letters, 4, 1470-1477(2019).

    [120] Di Giammarino L, Aloise I, Stachniss C et al. Visual place recognition using LiDAR intensity information[C]. Czech Republic, 4382-4389(2021).

    [121] Hata A, Wolf D. Road marking detection using LIDAR reflective intensity data and its application to vehicle localization[C], 584-589(2014).

    [122] Nguyen T M, Yuan S H, Cao M Q et al. NTU VIRAL: a visual-inertial-ranging-lidar dataset, from an aerial vehicle viewpoint[J]. The International Journal of Robotics Research, 41, 270-280(2022).

    [123] Helmberger M, Morin K, Berner B et al. The Hilti SLAM challenge dataset[J]. IEEE Robotics and Automation Letters, 7, 7518-7525(2022).

    [124] Ramezani M, Wang Y D, Camurri M et al. The newer college dataset: handheld LiDAR, inertial and vision with ground truth[C], 4353-4360(2021).
