Attention guided for partial domain adaptation

Changchun Zhang, Qingjie Zhao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

How to effectively extract feature representations from unlabeled target-domain samples is critical for unsupervised domain adaptation, especially for partial domain adaptation, where the source label space is a superset of the target label space, as it helps reduce the large performance gap caused by domain shift or domain bias. In this paper, a novel partial domain adaptation method named Multiple Self-Attention Networks (MSAN), based on adversarial learning, is proposed. Unlike most existing partial domain adaptation methods, which focus only on high-level features, MSAN extracts both effective high-level context features and low-level structural features from unlabeled target data with the help of labeled source data. Specifically, we present a multiple self-attention network, a general approach to learning more fine-grained and transferable features through gradual feature enhancement, so that domain shift is reduced and the generalization power of the model is boosted. Comprehensive experiments on the Office-31 and Office-Home datasets demonstrate that the proposed approach significantly improves upon representative partial domain adaptation methods, yielding state-of-the-art results on various partial transfer tasks.
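The abstract gives no implementation details, so as a rough illustration only, the kind of self-attention feature enhancement it describes might be sketched as follows in NumPy. The function name, shapes, and the residual formulation are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_enhance(feat):
    """One self-attention enhancement step over a feature map (illustrative).

    feat: (C, N) array -- C channels, N spatial positions.
    Returns a feature map of the same shape, re-weighted by the
    attention between spatial positions and added back residually.
    """
    # pairwise affinities between spatial positions: (N, N)
    attn = softmax(feat.T @ feat, axis=-1)
    enhanced = feat @ attn.T   # aggregate features by attention weights
    return feat + enhanced     # residual connection keeps original signal

# hypothetical example: an 8-channel feature map with 16 positions
rng = np.random.default_rng(0)
f = rng.standard_normal((8, 16))
g = self_attention_enhance(f)
```

In MSAN such enhancement is applied gradually at multiple levels (low-level structural and high-level context features); this sketch shows only a single step on one feature map.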

Original language: English
Pages (from-to): 860-869
Number of pages: 10
Journal: Information Sciences
Volume: 547
DOIs
Publication status: Published - 8 Feb 2021

Keywords

  • Adversarial Networks
  • Attention Mechanism
  • Partial Domain Adaptation
