Laser & Optoelectronics Progress, Vol. 60, Issue 16, 1610012 (2023)
Wenqing Liu, Renhua Wang*, Xiaowen Liu, and Xin Yang
Author Affiliations: Department of Information and Cyber Security, People's Public Security University of China, Beijing 100038, China
DOI: 10.3788/LOP222293
Wenqing Liu, Renhua Wang, Xiaowen Liu, Xin Yang. Infrared and Visible Image Fusion Method Based on Saliency Target Extraction and Poisson Reconstruction[J]. Laser & Optoelectronics Progress, 2023, 60(16): 1610012.

    Abstract

An infrared and visible image fusion method based on saliency target extraction and Poisson reconstruction is proposed to address the incomplete salient targets, blurred edges, and low contrast that arise when fusing infrared and visible images in low-illumination environments. First, exploiting the difference in saliency intensity among infrared image pixels, the salient target is extracted by combining saliency detection, threshold segmentation, and Gamma correction, thereby separating the target from the background of the infrared image. Second, the visual saliency features and gradient saliency of the source images are considered, and the fused image is reconstructed by solving the Poisson equation in the gradient domain. Finally, the mean and standard deviation of the infrared image are used to optimize the fused image, improving the quality of the result in low-illumination conditions. Experimental results show that the proposed method outperforms the comparison methods in both subjective and objective evaluations: it better highlights infrared target information, retains rich background information, and achieves a remarkable visual effect.
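The abstract describes a three-stage pipeline: salient-target extraction from the infrared image (saliency detection, threshold segmentation, Gamma correction), gradient-domain fusion by solving a Poisson equation, and a final optimization that matches the statistics of the infrared image. The sketch below illustrates one way such a pipeline could look in Python/NumPy. It is not the authors' exact formulation: the deviation-from-mean saliency measure, the mean-based threshold, the gamma value, the larger-gradient selection rule, and the Jacobi Poisson solver are all illustrative assumptions.

```python
# Minimal sketch of the pipeline in the abstract, assuming aligned
# grayscale float images in [0, 1]. All specific choices below are
# illustrative assumptions, not the paper's exact method.
import numpy as np

def extract_salient_target(ir, gamma=0.7):
    """Rough salient-target weights from the infrared image:
    saliency measure -> threshold segmentation -> Gamma correction."""
    saliency = np.abs(ir - ir.mean())          # per-pixel saliency intensity (assumed measure)
    saliency /= saliency.max() + 1e-8
    mask = saliency > saliency.mean()          # simple threshold segmentation (assumed)
    weight = np.power(saliency, gamma)         # Gamma correction of the weights
    return mask.astype(float) * weight

def gradients(img):
    """Forward-difference image gradients."""
    gx = np.zeros_like(img); gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return gx, gy

def fuse_poisson(ir, vis, n_iter=2000):
    """Keep the per-pixel gradient with larger magnitude (gradient
    saliency), bias toward IR inside the salient target, then
    reconstruct by solving the Poisson equation with Jacobi sweeps."""
    w = extract_salient_target(ir)
    gx_i, gy_i = gradients(ir)
    gx_v, gy_v = gradients(vis)
    take_ir = (np.hypot(gx_i, gy_i) > np.hypot(gx_v, gy_v)) | (w > 0.5)
    gx = np.where(take_ir, gx_i, gx_v)
    gy = np.where(take_ir, gy_i, gy_v)
    # Divergence of the target gradient field (backward differences),
    # giving the right-hand side of the Poisson equation lap(f) = div.
    div = np.zeros_like(ir)
    div[:, 1:] += gx[:, 1:] - gx[:, :-1]; div[:, 0] += gx[:, 0]
    div[1:, :] += gy[1:, :] - gy[:-1, :]; div[0, :] += gy[0, :]
    f = vis.copy()                             # Dirichlet boundary from the visible image
    for _ in range(n_iter):                    # Jacobi iteration on interior pixels
        f[1:-1, 1:-1] = 0.25 * (f[:-2, 1:-1] + f[2:, 1:-1] +
                                f[1:-1, :-2] + f[1:-1, 2:] - div[1:-1, 1:-1])
    return np.clip(f, 0, 1)

def match_ir_statistics(fused, ir):
    """Low-illumination optimization: shift the fused image toward
    the mean and standard deviation of the infrared image."""
    out = (fused - fused.mean()) / (fused.std() + 1e-8)
    return np.clip(out * ir.std() + ir.mean(), 0, 1)
```

Given two registered grayscale frames ir and vis, match_ir_statistics(fuse_poisson(ir, vis), ir) would produce the fused result under these assumptions; a practical implementation would replace the slow Jacobi sweeps with an FFT- or multigrid-based Poisson solver.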