Abrupt motion tracking using a visual saliency embedded particle filter

Yingya Su*, Qingjie Zhao, Liujun Zhao, Dongbing Gu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

60 Citations (Scopus)

Abstract

Abrupt motion is a significant challenge that commonly causes traditional tracking methods to fail. This paper presents an improved visual saliency model and integrates it into a particle filter tracker to solve this problem. Once the target is lost, our algorithm recovers tracking by detecting the target region among the salient regions obtained from the saliency map of the current frame. In addition, to strengthen the saliency of the target region, the target model is used as prior knowledge to calculate a weight set, which is then utilized to construct our improved saliency map adaptively. Furthermore, we adopt the covariance descriptor as the appearance model to describe the object more accurately. Compared with several other tracking algorithms, the experimental results demonstrate that our method is more robust in dealing with various types of abrupt motion scenarios.
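The covariance descriptor mentioned in the abstract summarizes an image region by the covariance matrix of per-pixel feature vectors. As a minimal illustrative sketch (not the paper's exact formulation), assuming a commonly used feature vector of pixel coordinates, RGB values, and absolute intensity gradients:

```python
import numpy as np

def covariance_descriptor(region):
    """Covariance descriptor of an image region (illustrative sketch).

    region: H x W x 3 float array (RGB). Each pixel is mapped to a
    feature vector [x, y, R, G, B, |Ix|, |Iy|]; the region is then
    summarized by the covariance matrix of these vectors. The feature
    set here is an assumption, not necessarily the one used in the paper.
    """
    h, w, _ = region.shape
    ys, xs = np.mgrid[0:h, 0:w]            # pixel coordinates
    gray = region.mean(axis=2)             # crude grayscale conversion
    ix = np.abs(np.gradient(gray, axis=1)) # horizontal intensity gradient
    iy = np.abs(np.gradient(gray, axis=0)) # vertical intensity gradient
    feats = np.stack(
        [xs, ys, region[..., 0], region[..., 1], region[..., 2], ix, iy],
        axis=-1,
    ).reshape(-1, 7)                       # one 7-d feature per pixel
    return np.cov(feats, rowvar=False)     # 7 x 7 symmetric matrix
```

Such a descriptor is compact (fixed 7 x 7 size regardless of region size) and fuses spatial, color, and gradient information, which is why it is attractive as an appearance model for matching candidate regions.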

Original language: English
Pages (from-to): 1826-1834
Number of pages: 9
Journal: Pattern Recognition
Volume: 47
Issue number: 5
DOIs
Publication status: Published - May 2014

Keywords

  • Abrupt motion
  • Covariance descriptor
  • Object tracking
  • Particle filter
  • Visual saliency

