[2] Tan J B. Precise measurement: the cornerstone supporting the quality of high-end equipment manufacturing[J]. Zhangjiang Technology Review, 13-15(2020).
[3] Zhu J G. Precision measurement, the cornerstone of intelligent manufacturing[J]. Chinese Journal of Scientific Instrument, 38, 1821(2017).
[4] Chen H F, Tan Z, Shi Z Y et al. Optimization method for solution model of laser tracker multilateration measurement[J]. Measurement Science Review, 16, 205-210(2016).
[5] Oh T J, Kang J, Kim S et al. A practical 6D robot pose estimation using GPS and IMU in outdoor[C], 529-530(2012).
[6] Adams J K, Boominathan V, Avants B W et al. Single-frame 3D fluorescence microscopy with ultraminiature lensless FlatScope[J]. Science Advances, 3, e1701548(2017).
[7] Sahin C, Garcia-Hernando G, Sock J et al. A review on object pose recovery: from 3D bounding box detectors to full 6D pose estimators[J]. Image and Vision Computing, 96, 103898(2020).
[8] Chen X L, Liang X X, Sun X Y et al. Workspace and statics analysis of 4-UPS-UPU parallel coordinate measuring machine[J]. Measurement, 55, 402-407(2014).
[9] Yu L D, Cao J M, Zhao H N et al. Kinematics model of articulated arm measuring machine[J]. Optics and Precision Engineering, 29, 2603-2612(2021).
[12] Yang L W, Bao H, Fan Y C et al. Parameter calibration of 6-DOF parallel mechanism using orthogonal displacement measurement system[J]. Optics and Precision Engineering, 29, 316-328(2021).
[13] Zhang C Y, Jiang H Z. Rigid-flexible modal analysis of the hydraulic 6-DOF parallel mechanism[J]. Energies, 14, 1604(2021).
[14] Ceccarelli M, Carrasco C A, Ottaviano E. Error analysis and experimental tests of CATRASYS (Cassino Tracking System)[C], 2371-2376(2000).
[15] Varela M J, Ceccarelli M, Flores P. A kinematic characterization of human walking by using CaTraSys[J]. Mechanism and Machine Theory, 86, 125-139(2015).
[16] Cheng S L. Design of the calibration system based on draw-wire sensors for industrial robot[D](2014).
[17] Mei J P, Sun S J, Luo Z J et al. Positioning error analysis and kinematic calibration of robot palletizer based on one-dimensional cable measurement system[J]. Journal of Tianjin University (Science and Technology), 51, 748-754(2018).
[18] Qi F, Ping X L, Liu J et al. The parameter identification and error compensation of robot based on dynacal system[J]. Applied Mechanics and Materials, 701/702, 788-792(2014).
[19] Chen S J, Cheng L, Wang P J et al. Error analysis of autonomous calibration strategy based on roadheader attitude measurement system[J]. Journal of China Coal Society, 43, 2647-2652(2018).
[20] Muralikrishnan B, Phillips S, Sawyer D. Laser trackers for large-scale dimensional metrology: a review[J]. Precision Engineering, 44, 13-28(2016).
[21] Fan B, Ji Q S, Li M F et al. iGPS and laser tracker applications comparison in digital assembly of large aircraft parts[J]. Aeronautical Manufacturing Technology, 62, 57-62(2019).
[22] Jin Z J, Ke C S, Xiong R B et al. Thermal deformation compensation of laser tracker relocating in aircraft assembly[J]. International Journal of Precision Engineering and Manufacturing, 21, 641-647(2020).
[23] Jin Z J. Research on thermal compensation method of laser tracker re-location accuracy in aircraft assembly[J]. Journal of Aerospace Science and Technology, 7, 63-71(2019).
[24] Vikas, Sahu R K. A review on application of laser tracker in precision positioning metrology of particle accelerators[J]. Precision Engineering, 71, 232-249(2021).
[25] Zumbrunn R, Markendorf A, Loser R et al. Measurement system for determining six degrees of freedom of an object[P].
[26] Bridges R E. Camera based six degree-of-freedom target measuring and target tracking device with rotatable mirror[P].
[27] Bridges R E. Camera based six degree-of-freedom target measuring and target tracking device[P].
[28] Chen X Y, Zhang Q J, Sun Y L. Non-kinematic calibration of industrial robots using a rigid-flexible coupling error model and a full pose measurement method[J]. Robotics and Computer-Integrated Manufacturing, 57, 46-58(2019).
[29] Bai P J, Mei J P, Huang T et al. Kinematic calibration of Delta robot using distance measurements[J]. Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, 230, 414-424(2016).
[30] Ma T T, Ma K F, Zhang L W et al. Research on detection technology for body of large shield machine[J]. Mining & Processing Equipment, 46, 15-18(2018).
[31] Viña C, Morin P. Micro air vehicle local pose estimation with a two-dimensional laser scanner: a case study for electric tower inspection[J]. International Journal of Micro Air Vehicles, 10, 127-156(2018).
[32] Kok M, Hol J D, Schön T B. Using inertial sensors for position and orientation estimation[J]. Foundations & Trends in Signal Processing, 11, 1-153(2017).
[33] Fu M Y[M]. The magic world of inertia(2015).
[34] Ma J, Bajracharya M, Susca S et al. Real-time pose estimation of a dynamic quadruped in GPS-denied environments for 24-hour operation[J]. The International Journal of Robotics Research, 35, 631-653(2016).
[35] Zhu J C, Hu X P, Zhang J Y et al. The inertial attitude augmentation for ambiguity resolution in SF/SE-GNSS attitude determination[J]. Sensors, 14, 11395-11415(2014).
[36] Sabatini A M. Estimating three-dimensional orientation of human body parts by inertial/magnetic sensing[J]. Sensors, 11, 1489-1525(2011).
[37] Park Y B, Jeon H C, Park C G. Analysis of geometric effects on integrated inertial/vision for lunar descent navigation[J]. Journal of Guidance, Control, and Dynamics, 39, 937-943(2016).
[38] Zhang H, Ye C. Plane-aided visual-inertial odometry for 6-DOF pose estimation of a robotic navigation aid[J]. IEEE Access, 8, 90042-90051(2020).
[39] Jin F, Cai H G. Data fusion method of robot IMU and laser range finder based on synchronized scanner[J]. Robot, 22, 470-473(2000).
[40] Tsai C C, Lin H H, Lai S W. Multisensor 3D posture determination of a mobile robot using inertial and ultrasonic sensors[J]. Journal of Intelligent and Robotic Systems, 42, 317-335(2005).
[41] Ruiz A R J, Granja F S, Honorato J C P et al. Accurate pedestrian indoor navigation by tightly coupling foot-mounted IMU and RFID measurements[J]. IEEE Transactions on Instrumentation and Measurement, 61, 178-189(2012).
[42] Li H Y, Zhang Z L, Li C W et al. A survey of vision-based object pose estimation[C], 635-648(2020).
[43] Liu D Y, Arai S, Xu Y J et al. 6D pose estimation of occlusion-free objects for robotic bin-picking using PPF-MEAM with 2D images (occlusion-free PPF-MEAM)[J]. IEEE Access, 9, 50857-50871(2021).
[44] Peng L P, Zhao Y S, Qu S L et al. Real-time and robust 6D pose estimation of RGBD data for robotic bin picking[C], 5283-5288(2019).
[45] Gao S, Bai L Z. Monocular camera-based three-point laser pointer ranging and pose estimation method[J]. Acta Optica Sinica, 41, 0915001(2021).
[46] Luhmann T. Precision potential of photogrammetric 6DOF pose estimation with a single camera[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 64, 275-284(2009).
[47] Zhang L, Xu J F, Xia Q Y et al. An improvement and verification of position/attitude estimation algorithm based on binocular vision for unmanned aerial vehicle[J]. Journal of Shanghai Jiao Tong University, 49, 1387-1393(2015).
[48] Zhao C H, Fan B, Hu J W et al. Pose estimation for multi-camera systems[C], 533-538(2017).
[49] Harmat A, Trentini M, Sharf I. Multi-camera tracking and mapping for unmanned aerial vehicles in unstructured environments[J]. Journal of Intelligent & Robotic Systems, 78, 291-317(2015).
[50] Dinc S, Fahimi F, Aygun R. Mirage: an O(n) time analytical solution to 3D camera pose estimation with multi-camera support[J]. Robotica, 35, 2278-2296(2017).
[51] Sun B, Zhu J G, Yang L H et al. Calibration of line-scan cameras for precision measurement[J]. Applied Optics, 55, 6836-6843(2016).
[52] Kim Y K, Kim Y, Jung Y S et al. Developing accurate long-distance 6-DOF motion detection with one-dimensional laser sensors: three-beam detection system[J]. IEEE Transactions on Industrial Electronics, 60, 3386-3395(2013).
[53] Kim Y K, Jang I G, Kim K S et al. Design improvement of the three-beam detector towards a precise long-range 6-degree of freedom motion sensor system[J]. The Review of Scientific Instruments, 85, 015004(2014).
[54] Kim Y K, Jang I G, Kim Y et al. Structural optimization of a novel 6-DOF pose sensor system for enhancing noise robustness at a long distance[J]. IEEE Transactions on Industrial Electronics, 61, 5622-5631(2014).
[55] Lee D, Kweon I. A novel stereo camera system by a biprism[J]. IEEE Transactions on Robotics and Automation, 16, 528-541(2000).
[56] Lim K B, Xiao Y. Virtual stereovision system: new understanding on single-lens stereovision using a biprism[J]. Journal of Electronic Imaging, 14, 043020(2005).
[57] Lim K B, Wang D L, Kee W L. Virtual camera rectification with geometrical approach on single-lens stereovision using a biprism[J]. Journal of Electronic Imaging, 21, 023003(2012).
[58] Wei L K, Lim K B, Wang D L. Virtual epipolar line construction of single-lens bi-prism system[J]. Journal of Electronic Science and Technology, 10, 97-101(2012).
[59] Lim K B, Qian B B. Biprism distortion modeling and calibration for a single-lens stereovision system[J]. Journal of the Optical Society of America A, 33, 2213-2224(2016).
[60] Kee W L, Bai Y D, Lim K B. Parameter error analysis of single-lens prism-based stereovision system[J]. Journal of the Optical Society of America A, 32, 367-373(2015).
[61] Cui X Y, Lim K B, Guo Q Y et al. Accurate geometrical optics model for single-lens stereovision system using a prism[J]. Journal of the Optical Society of America A, 29, 1828-1837(2012).
[62] Cui X Y, Lim K B, Zhao Y et al. Single-lens stereovision system using a prism: position estimation of a multi-ocular prism[J]. Journal of the Optical Society of America A, 31, 1074-1082(2014).
[63] Chen C Y, Yang T T, Sun W S. Optics system design applying a micro-prism array of a single lens stereo image pair[J]. Optics Express, 16, 15495-15505(2008).
[64] Yang S P, Kim J J, Jang K W et al. Compact stereo endoscopic camera using microprism arrays[J]. Optics Letters, 41, 1285-1288(2016).
[65] Yang S P, Kim J J, Jang K W et al. Microprism arrays based stereoscopic endoscope[C](2015).
[66] Zhong F Q, Quan C G. A single color camera stereo vision system[J]. IEEE Sensors Journal, 18, 1474-1482(2018).
[67] Trivi M, Rabal H J. Stereoscopic uses of diffraction gratings[J]. Applied Optics, 27, 1007-1009(1988).
[68] Pan B, Wang Q. Single-camera microscopic stereo digital image correlation using a diffraction grating[J]. Optics Express, 21, 25056-25068(2013).
[69] Xia S M, Pan Z P, Zhang J W. Optical microscope for three-dimensional surface displacement and shape measurements at the microscale[J]. Optics Letters, 39, 4267-4270(2014).
[70] Shaw A D, Neild S A, Wagg D J et al. Single source three dimensional capture of full field plate vibrations[J]. Experimental Mechanics, 52, 965-974(2012).
[71] Pachidis T P, Lygouras J N. Pseudostereo-vision system: a monocular stereo-vision system as a sensor for real-time robot applications[J]. IEEE Transactions on Instrumentation and Measurement, 56, 2547-2560(2007).
[72] Shao X X, Eisa M M, Chen Z N et al. Self-calibration single-lens 3D video extensometer for high-accuracy and real-time strain measurement[J]. Optics Express, 24, 30124-30138(2016).
[73] Sturm P, Bonfort T. How to compute the pose of an object without a direct view?[M]. Leonardis A, Bischof H, Pinz A. Computer vision-ACCV 2006, 3852, 21-31(2006).
[74] Takahashi K, Nobuhara S, Matsuyama T. Mirror-based camera pose estimation using an orthogonality constraint[J]. IPSJ Transactions on Computer Vision and Applications, 8, 11-19(2016).
[75] Li X, Long G C, Guo P Y et al. Accurate mirror-based camera pose estimation with explicit geometric meanings[J]. Science China Technological Sciences, 57, 2504-2513(2014).
[76] Long G C, Kneip L, Li X et al. Simplified mirror-based camera pose computation via rotation averaging[C], 1247-1255(2015).
[77] Liu W L, Wu S T, Wu X L. Pose estimation method for planar mirror based on one-dimensional target[J]. Optical Engineering, 57, 073101(2018).
[78] Takahashi K, Nobuhara S, Matsuyama T. A new mirror-based extrinsic camera calibration using an orthogonality constraint[C], 1051-1058(2012).
[79] Jang K W, Yang S P, Baek S H et al. Electrothermal MEMS parallel plate rotation for single-imager stereoscopic endoscopes[J]. Optics Express, 24, 9667-9672(2016).
[80] Rosell F A. Prism scanner[J]. Journal of the Optical Society of America, 50, 521-526(1960).
[81] Li A H, Liu X S, Sun J F et al. Risley-prism-based multi-beam scanning LiDAR for high-resolution three-dimensional imaging[J]. Optics and Lasers in Engineering, 150, 106836(2022).
[82] Li A H, Liu X S, Zhao Z S. Compact three-dimensional computational imaging using a dynamic virtual camera[J]. Optics Letters, 45, 3801-3804(2020).
[83] Li A H, Deng Z J, Liu X S et al. A cooperative camera surveillance method based on the principle of coarse-fine coupling boresight adjustment[J]. Precision Engineering, 66, 99-109(2020).
[84] Li A H, Zhao Z S, Liu X S et al. Risley-prism-based tracking model for fast locating a target using imaging feedback[J]. Optics Express, 28, 5378-5392(2020).
[85] Li A H[M]. Double-prism multi-mode scanning: principles and technology(2018).
[86] Deng Z J, Li A H, Liu X S. Equivalent virtual cameras to estimate a six-degree-of-freedom pose in restricted-space scenarios[J]. Measurement, 184, 109903(2021).
[87] He Z X, Feng W X, Zhao X Y et al. 6D pose estimation of objects: recent technologies and challenges[J]. Applied Sciences, 11, 228(2021).
[88] Hoque S, Arafat M Y, Xu S X et al. A comprehensive review on 3D object detection and 6D pose estimation with deep learning[J]. IEEE Access, 9, 143746-143770(2021).
[89] Sun F M, Wang B. The solution distribution analysis of the P3P problem[C], 2033-2036(2010).
[90] Wang P, Xu G L, Cheng Y H et al. A simple, robust and fast method for the perspective-n-point problem[J]. Pattern Recognition Letters, 108, 31-37(2018).
[91] Hartley R, Zisserman A[M]. Multiple view geometry in computer vision(2000).
[92] Lepetit V, Moreno-Noguer F, Fua P. EPnP: an accurate O(n) solution to the PnP problem[J]. International Journal of Computer Vision, 81, 155-166(2009).
[93] Ferraz L, Binefa X, Moreno-Noguer F. Very fast solution to the PnP problem with algebraic outlier rejection[C], 501-508(2014).
[94] Schweighofer G, Pinz A. Globally optimal O(n) solution to the PnP problem for general camera models[C], 1-10(2008).
[95] Lu C P, Hager G D, Mjolsness E. Fast and globally convergent pose estimation from video images[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22, 610-622(2000).
[96] Hesch J A, Roumeliotis S I. A direct least-squares (DLS) method for PnP[C], 383-390(2011).
[97] Zheng Y Q, Kuang Y B, Sugimoto S et al. Revisiting the PnP problem: a fast, general and optimal solution[C], 2344-2351(2013).
[98] Zheng Y Q, Sugimoto S, Okutomi M. ASPnP: an accurate and scalable solution to the perspective-n-point problem[J]. IEICE Transactions on Information and Systems, E96-D, 1525-1535(2013).
[99] Li S Q, Xu C, Xie M. A robust O(n) solution to the perspective-n-point problem[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34, 1444-1450(2012).
[100] Chen H H. Pose determination from line-to-plane correspondences: existence condition and closed-form solutions[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13, 530-541(1991).
[101] Taylor C J, Kriegman D J. Structure and motion from line segments in multiple images[C], 1615-1620(1992).
[102] Ansar A, Daniilidis K. Linear pose estimation from points or lines[C], 282-296(2002).
[103] Zhao Z Q, Ye D, Chen G et al. Binocular vision method of measuring pose based on perpendicular lines[J]. Acta Optica Sinica, 34, 1015003(2014).
[104] Ulrich M, Wiedemann C, Steger C. Combining scale-space and similarity-based aspect graphs for fast 3D object recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34, 1902-1914(2012).
[105] Wang Y F. Design and implementation of a monocular flying target pose measurement system[D](2015).
[106] Hinterstoisser S, Holzer S, Cagniart C et al. Multimodal templates for real-time detection of texture-less objects in heavily cluttered scenes[C], 858-865(2011).
[107] Muñoz E, Konishi Y, Beltran C et al. Fast 6D pose from a single RGB image using Cascaded Forests Templates[C], 4062-4069(2016).
[108] Lin J J, Rickert M, Knoll A. 6D pose estimation for flexible production with small lot sizes based on CAD models using Gaussian process implicit surfaces[C], 10572-10579(2021).
[109] Song K T, Wu C H, Jiang S Y. CAD-based pose estimation design for random bin picking using a RGB-D camera[J]. Journal of Intelligent & Robotic Systems, 87, 455-470(2017).
[110] He Z X, Jiang Z W, Zhao X Y et al. Sparse template-based 6-D pose estimation of metal parts using a monocular camera[J]. IEEE Transactions on Industrial Electronics, 67, 390-401(2020).
[111] Li X L, Wang H, Yi L et al. Category-level articulated object pose estimation[C], 3703-3712(2020).
[112] Wang K K, Xie J, Zhang G F et al. Sequential 3D human pose and shape estimation from point clouds[C], 7273-7282(2020).
[113] Xue F, Lu W S, Chen Z et al. From LiDAR point cloud towards digital twin city: Clustering city objects based on Gestalt principles[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 167, 418-431(2020).
[114] Thoma J, Paudel D P, Chhatkuli A et al. Geometrically mappable image features[J]. IEEE Robotics and Automation Letters, 5, 2062-2069(2020).
[115] Cheng J Y, Wang C Q, Meng M Q H. Robust visual localization in dynamic environments based on sparse motion removal[J]. IEEE Transactions on Automation Science and Engineering, 17, 658-669(2020).
[116] Rusu R B, Blodow N, Beetz M. Fast point feature histograms (FPFH) for 3D registration[C], 3212-3217(2009).
[117] Rusu R B, Bradski G, Thibaux R et al. Fast 3D recognition and pose using the viewpoint feature histogram[C], 2155-2162(2010).
[118] Li P, Wang J, Zhao Y D et al. Improved algorithm for point cloud registration based on fast point feature histograms[J]. Journal of Applied Remote Sensing, 10, 045024(2016).
[119] do Monte Lima J P S, Teichrieb V. An efficient global point cloud descriptor for object recognition and pose estimation[C], 56-63(2016).
[120] Yu H S, Fu Q, Yang Z G et al. Robust robot pose estimation for challenging scenes with an RGB-D camera[J]. IEEE Sensors Journal, 19, 2217-2229(2019).
[121] Coleman S, Kerr D, Scotney B. Concurrent edge and corner detection[C], V273-V276(2007).
[122] Shang Y, Yu Q F. Vision-based disturbance-rejecting methods for space targets pose measurements[J]. Journal of Astronautics, 29, 938-942, 976(2008).
[123] Horn B K P, Hilden H M, Negahdaripour S. Closed-form solution of absolute orientation using orthonormal matrices[J]. Journal of the Optical Society of America A, 5, 1127-1135(1988).
[124] Horn B K P. Closed-form solution of absolute orientation using unit quaternions[J]. Journal of the Optical Society of America A, 4, 629-642(1987).
[125] Wang Y B, Wang Y J, Wu K et al. A dual quaternion-based, closed-form pairwise registration algorithm for point clouds[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 94, 63-69(2014).
[126] Besl P J, McKay N D. A method for registration of 3-D shapes[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 239-256(1992).
[127] Chen Y, Medioni G. Object modelling by registration of multiple range images[J]. Image and Vision Computing, 10, 145-155(1992).
[128] Yang J L, Li H D, Campbell D et al. Go-ICP: a globally optimal solution to 3D ICP point-set registration[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 38, 2241-2254(2016).
[129] Segal A, Haehnel D, Thrun S. Generalized-ICP[C](2009).
[130] Servos J, Waslander S L. Multi-Channel Generalized-ICP: a robust framework for multi-channel scan registration[J]. Robotics and Autonomous Systems, 87, 247-257(2017).
[131] Chen C W, Wang J, Shieh M D. Edge-based meta-ICP algorithm for reliable camera pose estimation[J]. IEEE Access, 9, 89020-89028(2021).
[132] Chen J L, Zhang L J, Liu Y et al. Survey on 6D pose estimation of rigid object[C], 7440-7445(2020).
[133] Yin P S, Ye J Y, Lin G S et al. Graph neural network for 6D object pose estimation[J]. Knowledge-Based Systems, 218, 106839(2021).
[134] Li Y, Wang G, Ji X Y et al. DeepIM: deep iterative matching for 6D pose estimation[M]. Ferrari V, Hebert M, Sminchisescu C et al. Computer vision-ECCV 2018, 11210, 695-711(2018).
[135] Chen Z, Yang W, Xu Z B et al. DCNet: dense correspondence neural network for 6DoF object pose estimation in occluded scenes[C], 3929-3937(2020).
[136] Zuo G Y, Zhang C W, Liu H X et al. 6D object pose estimation for low-quality rendering images[J]. Control and Decision, 37, 135-141(2022).
[137] Manhardt F, Arroyo D M, Rupprecht C et al. Explaining the ambiguity of object detection and 6D pose from visual data[C], 6840-6849(2019).
[138] Tekin B, Sinha S N, Fua P. Real-time seamless single shot 6D object pose prediction[C], 292-301(2018).
[139] Rad M, Lepetit V. BB8: a scalable, accurate, robust to partial occlusion method for predicting the 3D poses of challenging objects without using depth[C], 3848-3856(2017).
[140] Peng S D, Zhou X W, Liu Y et al. PVNet: pixel-wise voting network for 6DoF object pose estimation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44, 3212-3223(2022).
[141] Hu Y L, Hugonot J, Fua P et al. Segmentation-driven 6D object pose estimation[C], 3380-3389(2019).
[142] Park K, Patten T, Vincze M. Pix2Pose: pixel-wise coordinate regression of objects for 6D pose estimation[C], 7667-7676(2019).
[143] Cui H H, Jiang T, Du K P et al. 3D imaging method for multi-view structured light measurement via deep learning pose estimation[J]. Acta Optica Sinica, 41, 1712001(2021).
[144] Sun D Q, Duan H X, Pei H D et al. Pose measurement method of space non-cooperative targets based on TOF camera[J]. Acta Optica Sinica, 41, 2212003(2021).