Attention Fusion Mechanism for Domain Adaptive Object Detection

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Domain adaptive object detection aims to alleviate the performance degradation caused by domain shift. However, in most previous methods the design of the domain classifier is too simple: feature maps at different scales are fed to independent domain classifiers, which fails to exploit the relationships between the feature maps. To address these shortcomings, we design a new attention network for domain adaptive object detection. Our method introduces an attention-based fusion domain classifier, which takes multi-scale feature maps as input and uses an attention mechanism to generate an attention map that fuses deep-layer and shallow-layer feature maps, thereby enhancing the domain classifier's discriminative ability. In this way, the network obtains both global structural representations and local texture patterns at different levels. We evaluate our method on object detection tasks across several challenging datasets, and the experimental results demonstrate its effectiveness.
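The fusion step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a simple spatial attention map computed from the deep features (here, a sigmoid over the channel-averaged activations) that weights a convex combination of the deep and shallow maps, followed by a toy pooled linear domain classifier. All function names and shapes below are illustrative.

```python
import numpy as np

def attention_fuse(shallow, deep):
    """Fuse shallow and deep feature maps via a spatial attention map.

    shallow, deep: arrays of shape (C, H, W); the deep map is assumed
    to have already been upsampled to the shallow map's resolution.
    """
    # Spatial attention: sigmoid of the channel-averaged deep activations.
    attn = 1.0 / (1.0 + np.exp(-deep.mean(axis=0, keepdims=True)))  # (1, H, W)
    # Per-pixel convex combination of the two feature maps.
    return attn * deep + (1.0 - attn) * shallow  # (C, H, W)

def domain_logit(fused, w, b):
    """Toy domain classifier: global average pool + linear layer."""
    pooled = fused.mean(axis=(1, 2))  # (C,)
    return pooled @ w + b

rng = np.random.default_rng(0)
shallow = rng.standard_normal((8, 4, 4))
deep = rng.standard_normal((8, 4, 4))
fused = attention_fuse(shallow, deep)
logit = domain_logit(fused, rng.standard_normal(8), 0.0)
print(fused.shape)  # (8, 4, 4)
```

In the paper's setting the fused representation would feed a domain classifier trained adversarially (e.g. through a gradient reversal layer); the sketch only shows the fusion arithmetic.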

Original language: English
Title of host publication: Proceedings of the 44th Chinese Control Conference, CCC 2025
Editors: Jian Sun, Hongpeng Yin
Publisher: IEEE Computer Society
Pages: 8139-8144
Number of pages: 6
ISBN (Electronic): 9789887581611
DOIs
Publication status: Published - 2025
Externally published: Yes
Event: 44th Chinese Control Conference, CCC 2025 - Chongqing, China
Duration: 28 Jul 2025 – 30 Jul 2025

Publication series

Name: Chinese Control Conference, CCC
ISSN (Print): 1934-1768
ISSN (Electronic): 2161-2927

Conference

Conference: 44th Chinese Control Conference, CCC 2025
Country/Territory: China
City: Chongqing
Period: 28/07/25 – 30/07/25

Keywords

  • attention mechanism
  • domain adaptation
  • object detection

