Attention in Focus: Transformer-Powered Super-Resolution for Advanced Remote Sensing

Xinyu Yan, Qizhi Xu*, Jiuchen Chen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Transformer-based approaches have demonstrated outstanding performance in natural language processing and computer vision tasks due to their ability to model long-range dependencies. However, when applied to super-resolution of remote sensing images, transformer-based methods often produce overly smooth results that lack the necessary textural details. To overcome this challenge, we developed the Multi-Attention Residual Transformer (MART). MART utilizes a Multi-Scale Attention Module to integrate information at different scales, effectively restoring the complex details in remote sensing images. With its hybrid attention mechanism, MART captures both local and global features efficiently. Comprehensive evaluations on various remote sensing datasets show that MART significantly enhances image quality. Compared to widely used state-of-the-art methods, MART performs better in both qualitative comparisons and quantitative metrics, effectively restoring a wide range of landmark features.
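The paper itself is not reproduced in this record, so the snippet below is only a rough, hypothetical sketch of what a multi-scale attention block with a residual connection might look like in PyTorch. The kernel sizes, the channel-attention design, and the fusion strategy are assumptions made for illustration and do not come from the MART paper.

```python
# Hypothetical sketch of a multi-scale attention block; NOT the authors' MART code.
# Features are extracted at several "scales" via depthwise convolutions with
# different kernel sizes, re-weighted by a simple channel attention, and fused
# back into the input through a residual connection.
import torch
import torch.nn as nn


class MultiScaleAttention(nn.Module):
    def __init__(self, channels: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        # One depthwise conv branch per scale (kernel sizes are assumed).
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2, groups=channels)
            for k in kernel_sizes
        )
        fused = channels * len(kernel_sizes)
        # Squeeze-and-excitation style channel attention over all branches.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(fused, fused // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(fused // 4, fused, 1),
            nn.Sigmoid(),
        )
        self.project = nn.Conv2d(fused, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        feats = feats * self.attn(feats)   # re-weight each scale/channel
        return x + self.project(feats)     # residual connection


if __name__ == "__main__":
    block = MultiScaleAttention(64)
    out = block(torch.randn(1, 64, 48, 48))
    print(out.shape)  # torch.Size([1, 64, 48, 48])
```

This kind of block is one common way to integrate information at different scales before or inside a transformer backbone; the actual MART architecture may differ substantially.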

Original language: English
Title of host publication: 2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 358-362
Number of pages: 5
ISBN (Electronic): 9798331517199
DOI: 10.1109/ICCSSE63803.2024.10823881
Publication status: Published - 2024
Event: 2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024 - Beijing, China
Duration: 18 Oct 2024 - 20 Oct 2024

Publication series

Name: 2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024

Conference

Conference: 2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024
Country/Territory: China
City: Beijing
Period: 18/10/24 - 20/10/24

Keywords

  • attention mechanism
  • remote sensing
  • super-resolution
  • Transformer


Cite this

Yan, X., Xu, Q., & Chen, J. (2024). Attention in Focus: Transformer-Powered Super-Resolution for Advanced Remote Sensing. In 2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024 (pp. 358-362). (2024 IEEE International Conference on Control Science and Systems Engineering, ICCSSE 2024). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICCSSE63803.2024.10823881