• Optoelectronic Technology
  • Vol. 43, Issue 2, 166 (2023)
Yunfeng DAI1, Yajie DING1, Xingming FENG1, Han WANG1, and Qinghua WANG2
Author Affiliations
  • 1Yancheng Power Supply Branch, State Grid Jiangsu Power Supply Co., Ltd, Yancheng Jiangsu 224000, CHN
  • 2Changzhou Zhongneng Power Technology Co., Ltd, Changzhou Jiangsu 13000, CHN
    DOI: 10.19453/j.cnki.1005-488x.2023.02.010
    Yunfeng DAI, Yajie DING, Xingming FENG, Han WANG, Qinghua WANG. Calibration Method of Lidar and Visible Light Camera Based on EPnP Algorithm[J]. Optoelectronic Technology, 2023, 43(2): 166

    Abstract

    Aiming at the problems of a cumbersome calibration process and low calibration accuracy for lidar and visible light cameras, a joint calibration method based on the EPnP algorithm was proposed. First, an image feature point extraction method based on prior information was proposed: the coordinates of feature points in the visible light image were calculated accurately using the color and edge contour information of the calibration plate. Second, feature points in the 3D point cloud were extracted accurately, according to the size and edge contour information of the calibration plate, using the line-fitting-based feature point extraction method proposed in this paper. Finally, the matched feature points extracted from the synchronized visible light image and the lidar 3D point cloud were used as the input of the EPnP algorithm to solve the extrinsic parameter matrix between the lidar and the visible light camera. The experimental results showed that, compared with the Autoware calibration toolbox widely used in the industry, the proposed method achieved higher calibration accuracy and simplified the tedious calibration process.
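    The extrinsic matrix that EPnP recovers is the rigid transform (R, t) that, together with the camera intrinsics K, projects each lidar feature point onto its matched image feature point via u = K(RX + t). A minimal sketch of this projection model is shown below; the intrinsics K, rotation R, and translation t are illustrative placeholder values, not results from the paper. In practice the matched 2D-3D feature point pairs would be passed to an EPnP solver such as OpenCV's cv2.solvePnP with the cv2.SOLVEPNP_EPNP flag, and the recovered extrinsics would be validated by checking the reprojection error exactly as this model computes it.

    ```python
    # Sketch of the pinhole projection model underlying EPnP-based
    # lidar-camera extrinsic calibration. All numeric values below are
    # hypothetical placeholders for illustration only.

    # Camera intrinsic matrix K (fx, fy in pixels; cx, cy principal point).
    K = [[800.0, 0.0, 320.0],
         [0.0, 800.0, 240.0],
         [0.0, 0.0, 1.0]]

    def mat_vec(M, v):
        """Multiply a 3x3 matrix by a 3-vector."""
        return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

    def project(point_lidar, R, t):
        """Project a 3D lidar point into the image via u = K (R X + t)."""
        # Transform the point from the lidar frame to the camera frame.
        Xc = [mat_vec(R, point_lidar)[i] + t[i] for i in range(3)]
        # Apply the intrinsics, then perform the perspective division.
        uvw = mat_vec(K, Xc)
        return (uvw[0] / uvw[2], uvw[1] / uvw[2])

    # Identity rotation and a small translation: a stand-in for the
    # extrinsics that EPnP would return from matched feature point pairs.
    R = [[1.0, 0.0, 0.0],
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0]]
    t = [0.1, 0.0, 0.0]

    # A calibration-plate corner 5 m in front of the sensor, on the axis.
    u, v = project([0.0, 0.0, 5.0], R, t)
    ```

    Calibration accuracy is then typically reported as the mean pixel distance between such projected lidar points and their matched image feature points, which is the metric the comparison against the Autoware toolbox would rest on.
    
    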