Efficient Aerial Image Object Detection with Imaging Condition Decomposition

Ren Jin*, Zikai Jia, Zhaochen Chu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Object detection in aerial images faces domain adaptation challenges caused by changes in shooting height, viewing angle, and weather. These variations constitute a large number of fine-grained domains, placing greater demands on the network's generalizability. To tackle these challenges, we propose a submodule named Fine-grained Feature Disentanglement, which decomposes image features into domain-invariant and domain-specific components using practical imaging condition parameters. The resulting composite feature improves both domain generalization and single-domain accuracy compared with conventional fine-grained domain detection methods. The proposed algorithm is compared with state-of-the-art fine-grained domain detectors on the UAVDT and VisDrone datasets, where it achieves average detection precision improvements of 5.7 and 2.4 points, respectively.
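The abstract's core idea, splitting backbone features into a domain-invariant part and a domain-specific part modulated by imaging condition parameters, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the module structure, the FiLM-style conditioning, and all layer sizes are assumptions, and the three condition inputs (altitude, view angle, weather) merely mirror the conditions named in the abstract.

```python
import torch
import torch.nn as nn


class FineGrainedFeatureDisentanglement(nn.Module):
    """Hypothetical sketch: split a backbone feature map into a
    domain-invariant and a domain-specific component, with the
    specific branch conditioned on imaging parameters."""

    def __init__(self, channels: int, num_conditions: int = 3):
        super().__init__()
        # Two parallel 1x1 projections carve the feature map into
        # domain-invariant and domain-specific components.
        self.invariant_proj = nn.Conv2d(channels, channels, kernel_size=1)
        self.specific_proj = nn.Conv2d(channels, channels, kernel_size=1)
        # Imaging condition parameters modulate the specific branch
        # via per-channel scale and shift (FiLM-style conditioning;
        # an assumption, not the paper's stated mechanism).
        self.cond_mlp = nn.Linear(num_conditions, channels * 2)

    def forward(self, feat: torch.Tensor, conditions: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, H, W); conditions: (B, num_conditions)
        invariant = self.invariant_proj(feat)
        specific = self.specific_proj(feat)
        scale, shift = self.cond_mlp(conditions).chunk(2, dim=1)
        specific = specific * scale[..., None, None] + shift[..., None, None]
        # Recompose into the composite feature fed to the detector head.
        return invariant + specific


# Usage: backbone features plus normalized imaging condition
# parameters, e.g. [altitude, view angle, weather] per image.
module = FineGrainedFeatureDisentanglement(channels=256)
feat = torch.randn(2, 256, 64, 64)
conditions = torch.rand(2, 3)
out = module(feat, conditions)
print(out.shape)  # torch.Size([2, 256, 64, 64])
```

In a full detector, the composite output would feed the detection head, while the two components could carry separate losses encouraging invariance and condition specificity, respectively.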

Original language: English
Title of host publication: 2023 IEEE International Conference on Image Processing, ICIP 2023 - Proceedings
Publisher: IEEE Computer Society
Pages: 620-624
Number of pages: 5
ISBN (Electronic): 9781728198354
DOIs
Publication status: Published - 2023
Event: 30th IEEE International Conference on Image Processing, ICIP 2023 - Kuala Lumpur, Malaysia
Duration: 8 Oct 2023 - 11 Oct 2023

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 30th IEEE International Conference on Image Processing, ICIP 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 8/10/23 - 11/10/23

Keywords

  • Aerial image
  • Feature decomposition
  • Imaging condition
  • Object detection
