Hyperspectral image reconstruction using deep external and internal learning

Tao Zhang, Ying Fu, Lizhi Wang, Hua Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

60 Citations (Scopus)

Abstract

To solve the low spatial and/or temporal resolution problem from which conventional hyperspectral cameras often suffer, coded snapshot hyperspectral imaging systems have attracted increasing attention recently. Recovering a hyperspectral image (HSI) from its corresponding coded image is an ill-posed inverse problem, and learning an accurate prior of HSIs is essential to solving this inverse problem. In this paper, we present an effective convolutional neural network (CNN) based method for coded HSI reconstruction, which learns a deep prior from an external dataset as well as the internal information of the input coded image under a spatial-spectral constraint. Our method can effectively exploit spatial-spectral correlation and sufficiently represent the diverse nature of HSIs. Experimental results show that our method outperforms the state-of-the-art methods in both comprehensive quantitative metrics and perceptual quality.
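
As context for the inverse problem the abstract describes, below is a minimal sketch (not from the paper) of a coded snapshot forward model in the CASSI style: each spectral band is modulated by a coded aperture, shifted by a disperser, and summed onto a single 2D detector. The function name coded_snapshot, the shift step, and the array shapes are illustrative assumptions.

    # Minimal sketch of a CASSI-style coded snapshot forward model (assumed,
    # not the paper's exact imaging setup).
    import numpy as np

    def coded_snapshot(hsi, mask, step=1):
        """hsi: (H, W, B) hyperspectral cube; mask: (H, W) coded aperture.
        Returns a single (H, W + step*(B-1)) 2D snapshot measurement."""
        H, W, B = hsi.shape
        meas = np.zeros((H, W + step * (B - 1)))
        for b in range(B):
            # modulate band b with the coded aperture, then apply the
            # disperser's spatial shift and accumulate on the detector
            meas[:, b * step : b * step + W] += hsi[:, :, b] * mask
        return meas

    # Recovering the (H, W, B) cube from this single 2D measurement and the
    # known mask is the ill-posed inverse problem; the paper's CNN learns a
    # prior from external data and the input image's internal
    # spatial-spectral structure to regularize it.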

Original language: English
Title of host publication: Proceedings - 2019 International Conference on Computer Vision, ICCV 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 8558-8567
Number of pages: 10
ISBN (Electronic): 9781728148038
DOIs
Publication status: Published - Oct 2019
Event: 17th IEEE/CVF International Conference on Computer Vision, ICCV 2019 - Seoul, Korea, Republic of
Duration: 27 Oct 2019 - 2 Nov 2019

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: 2019-October
ISSN (Print): 1550-5499

Conference

Conference: 17th IEEE/CVF International Conference on Computer Vision, ICCV 2019
Country/Territory: Korea, Republic of
City: Seoul
Period: 27/10/19 - 2/11/19
