Visual Tracking by Sampling in Part Space

Lianghua Huang, Bo Ma*, Jianbing Shen, Hui He, Ling Shao, Fatih Porikli

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)

Abstract

In this paper, we present a novel part-based visual tracking method from the perspective of probability sampling. Specifically, we represent the target by a part space with two online learned probabilities to capture the structure of the target. The proposal distribution memorizes the historical performance of different parts, and it is used for the first round of part selection. The acceptance probability validates the specific tracking stability of each part in a frame, and it determines whether to accept its vote or to reject it. By doing this, we transform the complex online part selection problem into a probability learning one, which is easier to tackle. The observation model of each part is constructed by an improved supervised descent method and is learned in an incremental manner. Experimental results on two benchmarks demonstrate the competitive performance of our tracker against state-of-the-art methods.
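The two learned probabilities can be read as a two-stage sampling scheme: parts are first drawn according to a proposal distribution that encodes their historical reliability, and each drawn part's vote is then accepted or rejected according to its per-frame stability. The sketch below illustrates this idea in Python; the class name, update rule, and parameters are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

# Illustrative sketch only: the paper learns a proposal distribution over target
# parts (historical reliability) and an acceptance probability (per-frame
# tracking stability). The names, update rule, and parameters below are
# assumptions for illustration, not the authors' exact method.

class PartSampler:
    def __init__(self, n_parts, lr=0.1, rng=None):
        self.proposal = np.full(n_parts, 1.0 / n_parts)  # historical reliability
        self.lr = lr                                     # assumed learning rate
        self.rng = rng or np.random.default_rng()

    def select_parts(self, n_samples):
        """First round: draw candidate parts from the proposal distribution."""
        return self.rng.choice(len(self.proposal), size=n_samples, p=self.proposal)

    def accept(self, part_idx, stability):
        """Second round: accept a part's vote with probability given by its
        per-frame stability score (assumed to lie in [0, 1])."""
        return self.rng.random() < stability[part_idx]

    def update_proposal(self, part_idx, reward):
        """Online update: reinforce parts whose votes were accurate this frame
        (a simple exponential-moving-average rule, assumed for illustration)."""
        self.proposal[part_idx] = (1 - self.lr) * self.proposal[part_idx] + self.lr * reward
        self.proposal /= self.proposal.sum()             # renormalize to a distribution


if __name__ == "__main__":
    sampler = PartSampler(n_parts=8)
    stability = np.random.default_rng(0).uniform(0.3, 0.9, size=8)  # toy per-frame scores
    votes = [p for p in sampler.select_parts(20) if sampler.accept(p, stability)]
    print("accepted part votes:", votes)
```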

Original language: English
Article number: 8016595
Pages (from-to): 5800-5810
Number of pages: 11
Journal: IEEE Transactions on Image Processing
Volume: 26
Issue number: 12
DOI
Publication status: Published - Dec 2017
