Laser & Optoelectronics Progress, Vol. 59, Issue 14, 1415002 (2022)
Anhu Li†,*, Zhaojun Deng†, Xingsheng Liu, and Hao Chen
Author Affiliations
  • School of Mechanical Engineering, Tongji University, Shanghai 201804, China
DOI: 10.3788/LOP202259.1415002
Anhu Li, Zhaojun Deng, Xingsheng Liu, Hao Chen. Research Progresses of Pose Estimation Based on Virtual Cameras[J]. Laser & Optoelectronics Progress, 2022, 59(14): 1415002
    Fig. 1. Schematic of a laser tracker[20]
    Fig. 2. Basic schematic diagram of mainstream inertial-unit-based pose estimation system[32-33]. (a) Pose estimation system based on laser inertial units; (b) pose estimation system based on fiber optic inertial units; (c) pose estimation system based on MEMS inertial units
    Fig. 3. Virtual camera imaging system based on a bipartite prism. (a) Virtual viewpoint imaging system[55]; (b) virtual camera imaging system[56]
    Fig. 4. Virtual camera imaging system based on a micro-prism array[64-65]. (a) System schematic; (b) reconstruction principle; (c) beam propagation principle; (d) system composition; (e) traditional endoscopic imaging; (f) virtual camera imaging
Fig. 5. Virtual camera imaging system based on grating diffraction[68]
    Fig. 6. Mirror-based adjustable virtual camera imaging system[70-72]. (a) Single mirror; (b) double mirrors; (c) triple mirrors; (d) triple mirrors with a beam splitter; (e) four mirrors; (f) dual mirrors with a prismatic mirror
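The mirror configurations in Fig. 6 all rest on one geometric fact: a planar mirror creates a virtual camera at the reflection of the real camera's optical center across the mirror plane. Reflecting a point p across the plane n·x + d = 0 (unit normal n) gives p' = p − 2(n·p + d)n. A minimal numpy sketch of that reflection (the function name and example values are illustrative, not from the paper):

```python
import numpy as np

def reflect_point(p, n, d):
    """Reflect point p across the plane n . x + d = 0 (n must be unit length)."""
    return p - 2.0 * (n @ p + d) * n

# Camera at the origin, mirror plane z = 1 (n = [0, 0, 1], d = -1):
# the virtual camera sits at z = 2, mirrored behind the plane.
virtual_center = reflect_point(np.array([0.0, 0.0, 0.0]),
                               np.array([0.0, 0.0, 1.0]), -1.0)
print(virtual_center)
```

Each extra mirror in Figs. 6(b)-(f) composes one more such reflection, which is how a single physical camera yields several virtual viewpoints.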
    Fig. 7. Schematic illustration of 3D imaging using FMCW LiDAR[81]. (a) System layout; (b) multi-beam scanning mechanism using Risley prisms
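FMCW LiDAR (Fig. 7) infers range from the beat frequency between the transmitted linear chirp and its echo: R = c·f_b·T/(2B), with chirp bandwidth B and chirp period T. A quick numeric check of that relation (all parameter values are illustrative assumptions):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(f_beat, bandwidth, period):
    """Range from beat frequency for a linear FMCW chirp: R = c * f_b * T / (2 * B)."""
    return C * f_beat * period / (2.0 * bandwidth)

# 1 GHz chirp swept over 10 us: a 100 kHz beat corresponds to roughly 15 cm
print(fmcw_range(100e3, 1e9, 10e-6))
```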
Fig. 8. Pose estimation system based on dynamic virtual cameras[86]
    Fig. 9. Basic principle of target pose estimation based on point feature[90]
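The point-feature principle in Fig. 9 builds on the pinhole projection model: a world point P maps to pixel p = K(RP + t)/z, and the pose (R, t) is the one that minimizes reprojection error over matched features. A minimal numpy sketch of the forward projection only (the intrinsics and pose values are illustrative, not taken from the paper):

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project Nx3 world points to Nx2 pixels under pose (R, t) and intrinsics K."""
    cam = points_3d @ R.T + t     # world frame -> camera frame
    uv = cam @ K.T                # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3] # perspective divide

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                     # camera axes aligned with the world
t = np.array([0.0, 0.0, 5.0])     # target 5 units in front of the camera
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
# Projects to (320, 240) and (480, 240)
print(project(pts, K, R, t))
```

Pose estimation inverts this map: given enough 2D-3D correspondences, a PnP-type solver searches for the (R, t) whose projections best match the observed pixels.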
    Fig. 10. Pose estimation based on CAD template matching[104]
    Fig. 11. Basic principle of pose estimation method based on graph neural network[133]
| Name | Accuracy | Efficiency | Target | Range | Touching | Adaptability | Dynamic measurement | Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Coordinate measuring arm | Angle: 1″–2″; Position: 5–10 μm + 3.3 μm/m | + | N | ≤5 m | Y | +++ | N | ++ |
| Laser tracker | Angle: 1″–2″; Position: 10–20 μm + 6 μm/m | +++ | Y | ≤80 m | N | ++ | Y | +++ |
| Total station | Angle: 0.5″–1″; Position: 0.6 mm ± 1 mm/m | ++ | Y | ≥100 m | N | +++ | N | ++ |
| IMU | Angle: 0.3°; Position: 2 cm | ++ | N | – | N | ++ | Y | ++ |
| Monocular + rangefinder | Angle: 1°; Position: 3 mm | ++ | N | ≤30 m | N | ++ | Y | + |
| Binocular | Angle: 0.2°; Position: 0.1 mm + 1.2 mm/m | ++ | N/Y | ≤10 m | N | + | Y | + |
| Biprism-based system | Position: ≤2.36% | +++ | N/Y | ≤50 mm | N | ++ | Y | + |
| Mirror-based system | Angle: ≤1°; Position: ≤12 mm | ++ | N/Y | ≥60 mm | N | ++ | Y | + |
| Rotating-prism-based system | Angle: ≤0.4°; Position: ≤0.4 mm | ++ | N/Y | ≥80 mm | N | +++ | Y | + |

Table 1. Performance comparison of mainstream pose estimation systems
| Category | Method | Accuracy · Time consuming · Robustness · Online performance · Scope of application |
| --- | --- | --- |
| Pose estimation based on 2D information | Target feature | ++++++++++ |
| | Template matching | ++++++++++ |
| Pose estimation based on 3D information | Feature matching | +++++++++ |
| | Absolute orientation | ++++/++++++ |
| Pose estimation based on deep learning | Direct regression | ++++++++++ |
| | Indirect regression | +++++++++++ |

Table 2. Comparison of mainstream pose estimation methods
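The "absolute orientation" method in Table 2 refers to the closed-form rigid alignment of two matched 3D point sets, commonly solved with the SVD-based Kabsch/Umeyama procedure: center both sets, take the SVD of the cross-covariance, and assemble R with a determinant guard against reflections. A minimal pure-numpy sketch (function name and test values are my own, not the paper's implementation):

```python
import numpy as np

def absolute_orientation(src, dst):
    """Closed-form rigid transform (R, t) with dst ~= src @ R.T + t (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Verify on a known 30-degree rotation about z plus a translation
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
src = np.random.default_rng(0).normal(size=(10, 3))
dst = src @ R_true.T + t_true
R_est, t_est = absolute_orientation(src, dst)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

Because the solution is algebraic rather than iterative, this class of method scores well on time consumption in Table 2 when depth data supply the 3D correspondences directly.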