• Acta Photonica Sinica
  • Vol. 53, Issue 4, 0415001 (2024)
Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, and Wei TAO*
Author Affiliations
  • School of Sensing Science and Engineering, School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
DOI: 10.3788/gzxb20245304.0415001
Yufeng XU, Yuanzhi LIU, Minghui QIN, Hui ZHAO, Wei TAO. Global Low Bias Visual/inertial/weak-positional-aided Fusion Navigation System[J]. Acta Photonica Sinica, 2024, 53(4): 0415001
Fig. 1. Overall framework of our method
Fig. 2. Factor graph model
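The factor graph in Fig. 2 fuses relative visual/inertial odometry constraints with weak global position measurements. As an illustrative sketch only, not the paper's implementation, the following GTSAM snippet shows how such a graph can combine a between-pose odometry factor with a weak global position factor; all keys, measurements, and noise values are invented for the example.

```python
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X  # pose keys x0, x1, ...

graph = gtsam.NonlinearFactorGraph()

# Anchor the first pose (visual/inertial odometry is only relative).
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1] * 6))
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))

# Relative-pose factor from VIO between consecutive keyframes
# (hypothetical measurement: 1 m forward).
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(
    np.array([0.01, 0.01, 0.01, 0.05, 0.05, 0.05]))  # rot (rad), trans (m)
vio_delta = gtsam.Pose3(gtsam.Rot3(), np.array([1.0, 0.0, 0.0]))
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), vio_delta, odom_noise))

# Weak global position factor (e.g. a low-accuracy GNSS fix, sigma = 1.5 m).
gps_noise = gtsam.noiseModel.Isotropic.Sigma(3, 1.5)
graph.add(gtsam.GPSFactor(X(1), np.array([1.1, -0.2, 0.0]), gps_noise))

# Initial guesses and optimization.
initial = gtsam.Values()
initial.insert(X(0), gtsam.Pose3())
initial.insert(X(1), gtsam.Pose3(gtsam.Rot3(), np.array([0.9, 0.0, 0.0])))
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X(1)).translation())
```

Even very noisy global factors are useful in this formulation: odometry factors alone leave the graph free to drift, while occasional absolute position measurements bound the accumulated error.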
Fig. 3. ArUco target coordinate system definition and coordinate transformation between camera and ArUco target
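Fig. 3 defines the transformation between the camera and the ArUco target. A minimal sketch of recovering that transform with OpenCV's ArUco module (assuming OpenCV >= 4.7; the camera intrinsics, marker dictionary, and marker size below are placeholders, not the paper's values):

```python
import cv2
import numpy as np

# Hypothetical calibration and marker size; replace with real values.
K = np.array([[600.0, 0, 480], [0, 600.0, 300], [0, 0, 1]])  # 960x600 image
dist = np.zeros(5)
MARKER_LEN = 0.15  # marker side length in metres (assumed)

# Detect markers (OpenCV >= 4.7 ArUco API).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
corners, ids, _ = detector.detectMarkers(gray)

# Marker corners in the target frame (z = 0 plane, centred on the marker),
# in the same order detectMarkers returns them: TL, TR, BR, BL.
half = MARKER_LEN / 2.0
obj_pts = np.array([[-half, half, 0], [half, half, 0],
                    [half, -half, 0], [-half, -half, 0]], dtype=np.float32)

if ids is not None:
    for c in corners:
        # solvePnP gives the transform from the target frame to the camera frame.
        ok, rvec, tvec = cv2.solvePnP(obj_pts, c.reshape(4, 2), K, dist)
        R, _ = cv2.Rodrigues(rvec)     # 3x3 rotation, target -> camera
        t_cam_in_target = -R.T @ tvec  # camera position in the target frame
        print(t_cam_in_target.ravel())
```

Inverting the solvePnP result, as in the last two lines, yields the camera pose expressed in the target coordinate system, which is the direction of the transformation illustrated in Fig. 3.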
Fig. 4. Diagram of the experimental site
Fig. 5. Wheeled robot platform and its sensor distribution
Fig. 6. LiDAR point cloud map generation and trajectory ground truth generation
Fig. 7. Navigation trajectories of each method under the three scenarios
Fig. 8. Outdoor scene feature point tracking at night
Fig. 9. Examples of ArUco targets distributed along the path and of ArUco target detection
Fig. 10. Navigation trajectory comparison with and without ArUco assistance
| Type of sensor | Model | Frequency/Hz | Specification |
|---|---|---|---|
| Gray camera | DALSA M1930 | 20 | Resolution: 960×600 |
| IMU | Xsens MTi-680G | 400 | Noise: 0.002 (°/s)/√Hz (gyro), 0.01 (m/s²)/√Hz (accel) |
| GNSS | u-blox ZED-F9P-01B | 10 | Horizontal position accuracy (PVT): 1.5 m CEP |
| Ultrasonic tag | Marvelmind Super-Beacon | 0.1~7 | Accuracy: 1%~3% of base-station distance |
| LiDAR | Velodyne VLP-16 | 10 | FoV: 360°×30°; accuracy: ±3 cm @ 100 m |

Table 1. System hardware parameters
| Sequence | Evaluation index | Ours | VINS-Mono | ORB-SLAM3 |
|---|---|---|---|---|
| Indoor | RPE/%↓ | 0.837/0.669/1.044 | 1.401/1.317/1.573 | 4.193/4.496/4.543 |
| Indoor | ATE/m↓ | 0.619/0.556/0.740 | 3.359/3.587/3.523 | 3.041/2.759/3.653 |
| Daytime in-outdoor | RPE/%↓ | 2.622/2.177/3.234 | 3.039/2.382/3.786 | 7.091/7.148/7.426 |
| Daytime in-outdoor | ATE/m↓ | 2.813/2.627/3.325 | 5.017/4.867/5.435 | 8.861/8.262/10.32 |
| Nighttime in-outdoor | RPE/%↓ | 2.028/1.743/2.278 | 5.451/3.323/7.387 | 10.32/9.899/11.61 |
| Nighttime in-outdoor | ATE/m↓ | 2.988/3.108/3.495 | 9.313/8.983/10.77 | 12.55/10.62/15.02 |

Table 2. Error test results under different experimental conditions (each entry: mean/median/RMSE)
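ATE and RPE in Table 2 are the standard absolute and relative trajectory error metrics. A minimal sketch of the three reported statistics (mean/median/RMSE), assuming time-synchronised trajectories that have (for ATE) already been aligned to the ground truth; the toy data below is invented for the example:

```python
import numpy as np

def error_stats(est_xyz: np.ndarray, gt_xyz: np.ndarray):
    """mean/median/RMSE of per-pose translational error.

    est_xyz, gt_xyz: (N, 3) positions, time-synchronised and (for ATE)
    already aligned, e.g. with a Umeyama similarity alignment.
    """
    err = np.linalg.norm(est_xyz - gt_xyz, axis=1)
    return err.mean(), np.median(err), np.sqrt(np.mean(err ** 2))

# Hypothetical toy data: a random-walk ground truth plus small noise.
gt = np.cumsum(np.random.randn(100, 3) * 0.1, axis=0)
est = gt + np.random.randn(100, 3) * 0.05
print("ATE mean/median/RMSE: %.3f/%.3f/%.3f" % error_stats(est, gt))
```

RPE is computed analogously on relative motions over sub-trajectories and, as in Table 2, is typically reported as a percentage of distance travelled, which makes it insensitive to globally accumulated drift.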
| Evaluation index | VIO+ArUco (ours) | VIO |
|---|---|---|
| RPE/%↓ | 3.518/3.182/4.023 | 3.645/3.394/4.106 |
| ATE/m↓ | 2.917/2.935/3.045 | 6.412/7.132/6.683 |

Table 3. Error test results with and without ArUco assistance (each entry: mean/median/RMSE)
| Scenario | Indoor | Daytime outdoor | Nighttime outdoor | Daytime in-outdoor | Nighttime in-outdoor |
|---|---|---|---|---|---|
| Average time spent/ms | 29.04 | 23.51 | 27.39 | 23.54 | 27.95 |

Table 4. Single-run processing time of this method in different scenarios