Adaptive Nonlocal Sparse Representation for Dual-Camera Compressive Hyperspectral Imaging

Lizhi Wang, Zhiwei Xiong, Guangming Shi, Feng Wu, Wenjun Zeng

Research output: Contribution to journal › Article › peer-review

145 Citations (Scopus)

Abstract

Leveraging the compressive sensing (CS) theory, coded aperture snapshot spectral imaging (CASSI) provides an efficient solution to recover 3D hyperspectral data from a 2D measurement. The dual-camera design of CASSI, by adding an uncoded panchromatic measurement, enhances the reconstruction fidelity while maintaining the snapshot advantage. In this paper, we propose an adaptive nonlocal sparse representation (ANSR) model to boost the performance of dual-camera compressive hyperspectral imaging (DCCHI). Specifically, the CS reconstruction problem is formulated as a 3D cube-based sparse representation to make full use of the nonlocal similarity in both the spatial and spectral domains. Our key observation is that the panchromatic image, besides playing the role of a direct measurement, can be further exploited to aid the nonlocal similarity estimation. Therefore, we design a joint similarity metric by adaptively combining the internal similarity within the reconstructed hyperspectral image and the external similarity within the panchromatic image. In this way, the fidelity of CS reconstruction is greatly enhanced. Both simulation and hardware experimental results show significant improvement of the proposed method over the state-of-the-art.
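The joint similarity idea described in the abstract can be sketched as follows: rank candidate patches by a weighted combination of an internal distance (computed in the reconstructed hyperspectral cube) and an external distance (computed in the co-registered panchromatic image). This is a minimal illustration, not the paper's exact formulation; the function name, the fixed mixing weight `alpha`, and the max-normalization of the distances are assumptions for the sketch, whereas the paper adapts the combination per patch.

```python
import numpy as np

def joint_patch_similarity(hsi_patches, pan_patches, ref_idx, alpha=0.5):
    """Rank candidate patches by a joint similarity metric combining
    internal (hyperspectral) and external (panchromatic) distances.

    hsi_patches: (N, d_hsi) array of flattened hyperspectral patches
    pan_patches: (N, d_pan) array of flattened panchromatic patches
    ref_idx: index of the reference patch
    alpha: weight on the internal distance (fixed here; the paper's
           ANSR model adapts this weighting -- an assumption for the sketch)
    """
    # Squared Euclidean distance of every patch to the reference patch,
    # measured separately in each modality.
    d_int = np.sum((hsi_patches - hsi_patches[ref_idx]) ** 2, axis=1)
    d_ext = np.sum((pan_patches - pan_patches[ref_idx]) ** 2, axis=1)

    # Normalize each distance to a comparable [0, 1] scale before mixing,
    # so neither modality dominates purely through its dynamic range.
    d_int = d_int / (d_int.max() + 1e-12)
    d_ext = d_ext / (d_ext.max() + 1e-12)

    # Convex combination of the two cues; smaller joint distance means
    # a more similar patch for the nonlocal grouping step.
    joint = alpha * d_int + (1.0 - alpha) * d_ext
    return np.argsort(joint)  # indices of patches, most similar first
```

The ranked indices would then drive the nonlocal grouping stage, stacking the most similar 3D cubes into a group for sparse representation.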

Original language: English
Article number: 7676344
Pages (from-to): 2104-2111
Number of pages: 8
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 39
Issue number: 10
Publication status: Published - 1 Oct 2017
Externally published: Yes

Keywords

  • Compressive sensing
  • dual-camera
  • hyperspectral imaging
  • nonlocal similarity
  • sparse representation
