Distractor-aware deep regression for visual tracking

Ming Du, Yan Ding*, Xiuyun Meng, Hua Liang Wei, Yifan Zhao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

In recent years, regression trackers have drawn increasing attention in the visual object tracking community because of their favorable performance and simple implementation. These trackers directly learn a mapping from dense samples around the target object to Gaussian-like soft labels. However, in many real applications, the extremely imbalanced distribution of training samples hinders the robustness and accuracy of regression trackers at test time. In this paper, we propose a novel and effective distractor-aware loss function that alleviates this imbalance by highlighting the significant domain and heavily penalizing pure background regions. In addition, we introduce a fully differentiable hierarchy-normalized concatenation connection to exploit abstractions across multiple convolutional layers. Extensive experiments were conducted on five challenging tracking benchmarks: OTB-13, OTB-15, TC-128, UAV-123, and VOT17. The experimental results are promising and show that the proposed tracker performs much better than nearly all of the compared state-of-the-art approaches.
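
To make the re-weighting idea concrete, the following is a minimal, hypothetical sketch (assuming PyTorch; the weighting scheme, threshold bg_thresh, and distractor_weight are illustrative assumptions, not the paper's exact formulation) of a distractor-aware L2 regression loss that emphasizes the target region and background distractors while suppressing easy pure-background pixels:

    # Hypothetical sketch of a distractor-aware regression loss (PyTorch).
    # The exact loss in the paper may differ; this only illustrates re-weighting
    # the per-pixel L2 residual so that the target region and "distractors"
    # (strong responses on the background) dominate the gradient, while easy
    # pure-background pixels are nearly ignored.
    import torch

    def distractor_aware_l2(pred, label, bg_thresh=0.05, distractor_weight=2.0):
        """pred, label: (B, 1, H, W) response maps; label is a Gaussian-like soft label."""
        residual = pred - label
        # Base weight: emphasize the labelled target region.
        weight = label.clone()
        # Distractor term: background pixels (label ~ 0) where the network still
        # fires strongly are treated as distractors and receive extra weight.
        background = label < bg_thresh
        distractors = background & (pred.detach() > bg_thresh)
        weight[distractors] = distractor_weight
        # Easy pure-background pixels keep a small weight so they contribute little.
        weight = weight.clamp(min=1e-3)
        return (weight * residual.pow(2)).mean()

    # Usage example with random tensors:
    pred = torch.rand(2, 1, 64, 64, requires_grad=True)
    label = torch.zeros(2, 1, 64, 64)
    label[:, :, 28:36, 28:36] = 1.0   # crude stand-in for a Gaussian soft label
    loss = distractor_aware_l2(pred, label)
    loss.backward()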

Original language: English
Article number: 387
Journal: Sensors
Volume: 19
Issue number: 2
DOIs
Publication status: Published - 2 Jan 2019

Keywords

  • Data imbalance
  • Deep-regression networks
  • Distractor aware
  • Object tracking

