Distilling Siamese Trackers with Attention Mask

Han Sun, Yongqiang Bai, Wenbo Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

In recent years, the introduction of the Siamese network has brought new vitality to the object tracking community. However, high-performance Siamese trackers cannot run at real-time speed on mobile devices because of their large and complex models. Knowledge distillation is a common and effective model compression method, but it is difficult to apply to a challenging task like object tracking. We find that the fundamental cause is the imbalance between foreground and background in the object tracking task, which aggravates the insufficient feature extraction ability of the small backbone. Therefore, we propose attention mask distillation (AMD) to help the student tracker focus on the foreground area faster and more accurately. The attention mask can be easily obtained from the feature maps and brings fine granularity to the traditional binary mask. Experimental results on OTB100 and VOT2018 show that our method enables the student tracker to perform as well as the teacher tracker. At the same time, it runs on a CPU at a hyper-real-time speed of 66 fps and achieves a model compression ratio of nearly 9×. Such low computational and storage costs make it possible to deploy high-performance trackers on resource-constrained platforms.
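The abstract does not spell out the loss, but the core idea of weighting a feature-distillation loss with an attention mask derived from the teacher's feature maps can be sketched as below. This is a minimal illustration in PyTorch, not the authors' exact formulation: the mask construction (channel-wise pooling of activations followed by a spatial softmax), the temperature value, and the names `attention_mask` and `amd_feature_loss` are all assumptions, and the student and teacher features are assumed to already share the same shape (e.g., after an adaptation layer).

```python
import torch
import torch.nn.functional as F


def attention_mask(feat: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """Spatial attention mask from a feature map of shape (B, C, H, W).

    Channel-wise mean of absolute activations, normalized with a spatial
    softmax so that foreground regions with strong responses receive larger
    weights than the background (assumed design, not the paper's exact mask).
    """
    b, _, h, w = feat.shape
    energy = feat.abs().mean(dim=1)                       # (B, H, W)
    mask = F.softmax(energy.view(b, -1) / temperature, dim=1)
    return mask.view(b, 1, h, w) * h * w                  # average weight ~ 1


def amd_feature_loss(student_feat: torch.Tensor,
                     teacher_feat: torch.Tensor) -> torch.Tensor:
    """Feature-matching distillation loss weighted by the teacher's attention mask."""
    mask = attention_mask(teacher_feat.detach())
    diff = (student_feat - teacher_feat.detach()) ** 2    # (B, C, H, W)
    return (mask * diff).mean()


# Toy usage: backbone features of the search region from teacher and student.
if __name__ == "__main__":
    teacher_feat = torch.randn(2, 256, 25, 25)
    student_feat = torch.randn(2, 256, 25, 25, requires_grad=True)
    loss = amd_feature_loss(student_feat, teacher_feat)
    loss.backward()
    print(float(loss))
```

In this sketch, the soft spatial mask is what gives the "fine granularity" relative to a binary foreground/background mask: every location is weighted continuously by the teacher's response strength rather than being kept or discarded outright.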

Original language: English
Title of host publication: Proceedings of the 41st Chinese Control Conference, CCC 2022
Editors: Zhijun Li, Jian Sun
Publisher: IEEE Computer Society
Pages: 6622-6627
Number of pages: 6
ISBN (Electronic): 9789887581536
DOIs
Publication status: Published - 2022
Event: 41st Chinese Control Conference, CCC 2022 - Hefei, China
Duration: 25 Jul 2022 - 27 Jul 2022

Publication series

Name: Chinese Control Conference, CCC
Volume: 2022-July
ISSN (Print): 1934-1768
ISSN (Electronic): 2161-2927

Conference

Conference: 41st Chinese Control Conference, CCC 2022
Country/Territory: China
City: Hefei
Period: 25/07/22 - 27/07/22

Keywords

  • Attention Mask
  • Knowledge Distillation
  • Object Tracking
  • Siamese Network
