Abrupt motion tracking using a visual saliency embedded particle filter

Yingya Su*, Qingjie Zhao, Liujun Zhao, Dongbing Gu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

60 Citations (Scopus)

Abstract

Abrupt motion is a significant challenge that commonly causes traditional tracking methods to fail. This paper presents an improved visual saliency model and integrates it into a particle filter tracker to solve this problem. Once the target is lost, our algorithm recovers tracking by detecting the target region among the salient regions obtained from the saliency map of the current frame. In addition, to strengthen the saliency of the target region, the target model is used as prior knowledge to calculate a weight set that adaptively constructs our improved saliency map. Furthermore, we adopt the covariance descriptor as the appearance model to describe the object more accurately. Experimental results demonstrate that, compared with several other tracking algorithms, our method is more robust in handling various types of abrupt motion scenarios.
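As a rough illustration of the appearance model mentioned in the abstract, the sketch below computes a region covariance descriptor for a grayscale image patch. The feature set (pixel coordinates, intensity, and gradient magnitudes) and the NumPy implementation are assumptions chosen for illustration; the paper's exact feature choice and comparison metric may differ.

```python
import numpy as np

def covariance_descriptor(patch):
    """Region covariance descriptor of a grayscale image patch.

    Assumed per-pixel feature vector: (x, y, intensity, |Ix|, |Iy|).
    Returns the 5x5 covariance matrix of these features over the region.
    """
    patch = patch.astype(float)
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)   # pixel coordinates
    gy, gx = np.gradient(patch)                  # first-order gradients
    feats = np.stack(
        [xs.ravel(), ys.ravel(), patch.ravel(),
         np.abs(gx).ravel(), np.abs(gy).ravel()],
        axis=0,
    )
    return np.cov(feats)

# Candidate regions (e.g. those proposed by particles or by salient
# regions of the current frame) can then be compared with the target
# model via a distance between covariance matrices, and the particle
# weights updated accordingly.
```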

Original language: English
Pages (from-to): 1826-1834
Number of pages: 9
Journal: Pattern Recognition
Volume: 47
Issue number: 5
DOI
Publication status: Published - May 2014
