Infrared and visible image fusion via multi-layer convolutional sparse representation
Affiliation:

(1.College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China; 2.College of Civil Aviation, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China)

CLC Number: V249.3, TP391

    Abstract:

    Integrating the advantages of infrared and visible images through image fusion is an effective means of enhancing the applicability of optical images under low-illumination conditions. Despite the wide application of sparse representation (SR) theory in infrared and visible image fusion, the drawbacks stemming from SR's local patch-based representation, namely detail loss and low tolerance to mis-registration, have never been effectively resolved. Unlike SR, the global representation capability of the recently emerged convolutional sparse representation (CSR) model shows great potential to overcome these deficiencies. Drawing on the convolutional neural network (CNN) architecture, a multi-layer CSR model was designed for pixel-level image fusion. The fusion model is constructed with five layers in a feed-forward manner: the first two layers are CSR layers, which compute sparse coefficient maps with respect to pre-learned dictionary filter sets; the third is a fusion layer, which merges the sparse coefficient maps; the last two are reconstruction layers, which reconstruct the fused image step by step to produce the final result. Experimental results indicate that the proposed fusion method effectively overcomes the two drawbacks of SR. It outperforms SR, CSR, and CNN methods on objective assessment metrics, and outperforms SR and CNN in computational complexity and computation time.
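    The coding-fusion-reconstruction pipeline described in the abstract can be sketched as follows. This is a minimal structural illustration, not the authors' implementation: a faithful CSR layer solves a convolutional basis pursuit problem (typically via ADMM), whereas here each CSR layer is approximated by filter correlation followed by soft-thresholding, the dictionary filters are random stand-ins rather than pre-learned ones, and only one coding/reconstruction level is shown instead of the paper's cascaded two.

```python
import numpy as np
from scipy.signal import convolve2d

def csr_layer(image, filters, thresh=0.1):
    """Approximate CSR coding: correlate the image with each dictionary
    filter, then soft-threshold to obtain sparse coefficient maps.
    (A faithful CSR solves a convolutional basis pursuit, e.g. via ADMM.)"""
    maps = []
    for f in filters:
        resp = convolve2d(image, f[::-1, ::-1], mode="same")  # correlation
        maps.append(np.sign(resp) * np.maximum(np.abs(resp) - thresh, 0.0))
    return maps

def fuse_maps(maps_a, maps_b):
    """Fusion layer: choose-max rule on absolute coefficient activity."""
    return [np.where(np.abs(ma) >= np.abs(mb), ma, mb)
            for ma, mb in zip(maps_a, maps_b)]

def reconstruct(maps, filters):
    """Reconstruction layer: sum of coefficient maps convolved with
    their dictionary filters."""
    out = np.zeros_like(maps[0])
    for m, f in zip(maps, filters):
        out += convolve2d(m, f, mode="same")
    return out

# Toy example with random "pre-learned" filters (an assumption for the
# sketch; the paper uses filter sets learned in advance from training data).
rng = np.random.default_rng(0)
filters = [rng.standard_normal((5, 5)) for _ in range(4)]
filters = [f / np.linalg.norm(f) for f in filters]

ir = rng.random((32, 32))    # stand-in infrared image
vis = rng.random((32, 32))   # stand-in visible image

maps_ir = csr_layer(ir, filters)
maps_vis = csr_layer(vis, filters)
fused = reconstruct(fuse_maps(maps_ir, maps_vis), filters)
print(fused.shape)  # (32, 32)
```

    Because coding, fusion, and reconstruction are separate feed-forward stages, each layer can be swapped independently, e.g. replacing the choose-max rule with an activity-weighted average, without touching the coding step.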

History
  • Received: May 11, 2020
  • Online: December 15, 2021