StemNet: A Dataset, Benchmark and Method for Scene Recognition in Remote Sensing

Jinyu Li, Mengmeng Zhang*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This study addresses the challenges of remote sensing scene recognition, which has traditionally been treated as an image classification problem — an approach prone to false positives and false negatives, especially in complex images. We propose a paradigm shift that frames scene recognition as an advanced object detection task, and we introduce a specialized dataset for assessing models under realistic conditions. Our method, StemNet, is a fusion technique that integrates hyperspectral and RGB imagery, surpassing traditional approaches in accuracy, precision, and robustness. In extensive experiments, StemNet consistently outperforms conventional techniques, offering a new perspective and setting a benchmark for future methodologies in remote sensing scene recognition. Together, the introduced dataset and StemNet advance both research and practice in this field.
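The abstract does not describe StemNet's architecture, but the hyperspectral/RGB fusion it mentions can be illustrated generically. The sketch below is an assumption, not the authors' method: it performs naive early fusion by reducing the hyperspectral cube with a per-pixel linear projection (a stand-in for a learned 1×1 convolution) and concatenating the result with the RGB channels. All shapes, band counts, and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: one 32x32 patch with 3 RGB bands and
# 64 hyperspectral bands (shapes are illustrative assumptions).
rgb = rng.random((3, 32, 32))
hsi = rng.random((64, 32, 32))

def fuse_features(rgb, hsi, n_components=8):
    """Naive early fusion: project the hyperspectral bands down to
    n_components per pixel with a random linear map (a stand-in for a
    learned 1x1 conv), then concatenate with RGB along the channel axis."""
    c, h, w = hsi.shape
    proj = rng.standard_normal((n_components, c)) / np.sqrt(c)
    hsi_flat = hsi.reshape(c, h * w)                       # (bands, pixels)
    hsi_reduced = (proj @ hsi_flat).reshape(n_components, h, w)
    return np.concatenate([rgb, hsi_reduced], axis=0)

fused = fuse_features(rgb, hsi)
print(fused.shape)  # (11, 32, 32): 3 RGB + 8 projected hyperspectral channels
```

In a real detection network the projection would be learned end-to-end and the fused tensor fed to a backbone; this snippet only shows the channel-wise fusion idea.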

Original language: English
Title of host publication: ICIGP 2024 - Proceedings of the 2024 7th International Conference on Image and Graphics Processing
Publisher: Association for Computing Machinery
Pages: 205-210
Number of pages: 6
ISBN (Electronic): 9798400716720
DOIs
Publication status: Published - 19 Jan 2024
Event: 7th International Conference on Image and Graphics Processing, ICIGP 2024 - Beijing, China
Duration: 19 Jan 2024 - 21 Jan 2024

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 7th International Conference on Image and Graphics Processing, ICIGP 2024
Country/Territory: China
City: Beijing
Period: 19/01/24 - 21/01/24

Keywords

  • Object Detection
  • Remote Sensing
  • Scene Recognition
