Image restoration from patch-based compressed sensing measurement

Hua Huang, Guangtao Nie, Yinqiang Zheng, Ying Fu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

A series of methods has been proposed to restore an image from compressive sensing (CS) based random measurements, but most of them have high time complexity and are ill-suited to patch-based CS capture, because their restoration results suffer from serious blocky artifacts. In this paper, we first present a compact network module, built on the residual convolutional neural network (CNN), that is effective both for image reconstruction from non-overlapping patch-based CS random measurements and for blocky artifact removal. We then design an end-to-end network for joint image patch reconstruction and blocky artifact removal, without a separate de-blocking step. By introducing a coding layer into this end-to-end network, we can learn the optimal compressive coding, rather than relying on Gaussian-distribution-based random sampling. Experimental results show that our proposed networks outperform state-of-the-art CS restoration methods on patch-based CS random measurements, on both synthetic and real data. More importantly, under the learned optimal CS coding, the restoration results are significantly better than those obtained with traditional random sampling. To demonstrate the effectiveness of our residual CNN based network module in a more general setting, we also apply the de-blocking process of our method to JPEG compression artifact removal and achieve outstanding performance.
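The patch-based CS measurement model that the abstract refers to can be sketched as follows. This is an illustrative simulation only, not the authors' code: an image is tiled into non-overlapping patches, each patch is vectorized, and a shared Gaussian random sensing matrix produces a small number of measurements per patch. The names `measure_patches`, `B`, and `M` are chosen here for illustration.

```python
import numpy as np

def measure_patches(image, B=8, M=16, seed=0):
    """Simulate patch-based CS capture: y = Phi @ x for each B x B patch.

    M < B*B makes the measurement compressive (here 16/64 = 25% rate).
    """
    H, W = image.shape
    assert H % B == 0 and W % B == 0, "image must tile into B x B patches"
    rng = np.random.default_rng(seed)
    # One shared Gaussian sensing matrix for all patches (M x B^2),
    # mimicking the Gaussian random sampling the paper compares against.
    Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, B * B))
    measurements = []
    for i in range(0, H, B):
        for j in range(0, W, B):
            x = image[i:i + B, j:j + B].reshape(-1)  # vectorized patch
            measurements.append(Phi @ x)             # M measurements
    return Phi, np.stack(measurements)               # (num_patches, M)

img = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)
Phi, Y = measure_patches(img)
print(Phi.shape, Y.shape)  # (16, 64) (64, 16)
```

Because each patch is reconstructed independently from its own `M` measurements, patch boundaries need not agree, which is the source of the blocky artifacts the proposed networks are designed to remove.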

Original language: English
Pages (from-to): 145-157
Number of pages: 13
Journal: Neurocomputing
Volume: 340
Publication status: Published - 7 May 2019

Keywords

  • Blocky artifact removal
  • Compressive sensing
  • Convolution neural network
  • Patch-based image restoration
