[1] Z WANG, Y WU, Q NIU. Multi-sensor fusion in automated driving: a survey. IEEE Access, 8, 2847-2868(2020).
[2] C CHEN, H ZHU, M LI et al. A review of visual-inertial simultaneous localization and mapping from filtering-based and optimization-based perspectives. Robotics, 7, 45(2018).
[3] Yu ZHANG, Xiping XU, Ning ZHANG et al. Research on visual odometry based on catadioptric panoramic camera. Acta Photonica Sinica, 50, 0415002(2021).
[4] M LI, A I MOURIKIS. Improving the accuracy of EKF-based visual-inertial odometry. IEEE International Conference on Robotics and Automation (ICRA), 828-835(2012).
[5] P GENEVA, K ECKENHOFF, W LEE et al. OpenVINS: a research platform for visual-inertial estimation. IEEE International Conference on Robotics and Automation (ICRA), 4666-4672(2020).
[6] T QIN, P LI, S SHEN. VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 34, 1004-1020(2018).
[7] C CAMPOS, R ELVIRA, J J G RODRÍGUEZ et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM. IEEE Transactions on Robotics, 37, 1874-1890(2021).
[8] K EBADI, Y CHANG, M PALIERI et al. LAMP: large-scale autonomous mapping and positioning for exploration of perceptually-degraded subterranean environments. IEEE International Conference on Robotics and Automation (ICRA), 80-86(2020).
[9] T LI, L PEI, Y XIANG et al. P3-VINS: tightly-coupled PPP/INS/visual SLAM based on optimization approach. IEEE Robotics and Automation Letters, 7, 7021-7027(2022).
[10] Jinkui CHU, Jianhua CHEN, Jinshan LI et al. Polarized light/binocular vision bionic integrated navigation method. Acta Photonica Sinica, 50, 0528001(2021).
[11] Z GONG, P LIU, F WEN et al. Graph-based adaptive fusion of GNSS and VIO under intermittent GNSS-degraded environment. IEEE Transactions on Instrumentation and Measurement, 70, 1-16(2021).
[12] T QIN, J PAN, S CAO et al. A general optimization-based framework for local odometry estimation with multiple sensors. arXiv Preprint(2019).
[13] S CAO, X LU, S SHEN. GVINS: tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation. IEEE Transactions on Robotics, 38, 2004-2021(2022).
[14] A ALARIFI, A AL-SALMAN, M ALSALEH et al. Ultra wideband indoor positioning technologies: analysis and recent advances. Sensors, 16, 707(2016).
[15] P LUTZ, M J SCHUSTER, F STEIDLE. Visual-inertial SLAM aided estimation of anchor poses and sensor error model parameters of UWB radio modules. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 739-746(2019).
[16] G ZIZZO, L REN. Position tracking during human walking using an integrated wearable sensing system. Sensors, 17, 2866(2017).
[17] P OŠČÁDAL, D HECZKO, A VYSOCKÝ et al. Improved pose estimation of Aruco tags using a novel 3D placement strategy. Sensors, 20, 4825(2020).
[18] F J ROMERO-RAMIREZ, R MUÑOZ-SALINAS, R MEDINA-CARNICER. Speeded up detection of squared fiducial markers. Image and Vision Computing, 76, 38-47(2018).
[19] B PFROMMER, N SANKET, K DANIILIDIS et al. PennCOSYVIO: a challenging visual inertial odometry benchmark. IEEE International Conference on Robotics and Automation (ICRA), 3847-3854(2017).
[20] Yue HU, Xu LI, Qimin XU et al. Reliable positioning method of intelligent vehicles based on factor graph in GNSS-denied environment. Chinese Journal of Scientific Instrument, 42, 79-86(2021).
[21] N KBAYER, M SAHMOUDI. Performances analysis of GNSS NLOS bias correction in urban environment using a three-dimensional city model and GNSS simulator. IEEE Transactions on Aerospace and Electronic Systems, 54, 1799-1814(2018).
[22] S WANG, X DONG, G LIU et al. GNSS RTK/UWB/DBA fusion positioning method and its performance evaluation. Remote Sensing, 14, 5928(2022).
[23] T QIN, S SHEN. Online temporal calibration for monocular visual-inertial systems. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3662-3669(2018).
[24] Y LIU, Y FU, M QIN et al. BotanicGarden: a high-quality and large-scale robot navigation dataset in challenging natural environments. arXiv Preprint(2023).
[25] J STURM, N ENGELHARD, F ENDRES et al. A benchmark for the evaluation of RGB-D SLAM systems. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 573-580(2012).