Novel Channel Attention Residual Network for Single Image Super-Resolution

Wenling Shi, Huiqian Du*, Wenbo Mei

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

A novel channel attention residual network (CAN) for single image super-resolution (SISR) is proposed to rescale pixel-wise features by explicitly modeling interdependencies between channels and encoding where visual attention is located. The backbone of CAN is the channel attention block (CAB). The proposed CAB combines a cosine similarity block (CSB) and a back-projection gating block (BG). The CSB fully exploits the global spatial information of each channel and computes the cosine similarity between channels to obtain finer channel statistics than first-order statistics. To explore channel attention further, we introduce effective back-projection into the gating mechanism and propose the BG. Meanwhile, we adopt local and global residual connections, which directly convey most low-frequency information to the final SR outputs, so that more computational resources are allocated to valuable high-frequency components through the channel attention mechanism. Extensive experiments show the superiority of the proposed CAN over state-of-the-art methods on benchmark datasets in both accuracy and visual quality.
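To make the abstract's description more concrete, below is a minimal PyTorch-style sketch of a channel attention block built around cosine-similarity channel statistics, a gating step, and a local residual connection. It is not the authors' implementation: the module names, the reduction ratio, and the use of a plain squeeze-and-excite style gate in place of the paper's back-projection gating (BG) are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CosineSimilarityAttention(nn.Module):
    """Illustrative stand-in for the CSB + gating step: derives per-channel
    statistics from pairwise cosine similarity between channel maps, then
    rescales the channels. The gate here is a simple MLP, not the paper's BG."""
    def __init__(self, channels, reduction=16):  # reduction ratio is an assumption
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        # Flatten each channel to a vector and L2-normalize it.
        v = F.normalize(x.view(b, c, h * w), dim=2)
        # Pairwise cosine similarity between channels: (b, c, c).
        sim = torch.bmm(v, v.transpose(1, 2))
        # Aggregate each channel's similarity to all others into one statistic.
        stats = sim.mean(dim=2)                      # (b, c)
        weights = self.gate(stats).view(b, c, 1, 1)  # channel attention weights
        return x * weights                           # channel-wise rescaling


class ChannelAttentionBlock(nn.Module):
    """Residual block wrapping the attention above; the skip connection is the
    local residual path that passes low-frequency content through unchanged."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            CosineSimilarityAttention(channels),
        )

    def forward(self, x):
        return x + self.body(x)
```

As a usage check, `ChannelAttentionBlock(64)(torch.randn(1, 64, 48, 48))` returns a tensor of the same shape; in a full SISR network, several such blocks would sit inside a global residual connection before the upsampling stage.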

Original language: English
Pages (from-to): 345-353
Number of pages: 9
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 29
Issue number: 3
DOIs
Publication status: Published - 1 Sept 2020

Keywords

  • Back-projection
  • Cosine similarity
  • Residual network
  • Super-resolution
