Visual Tracking by Sampling in Part Space

Lianghua Huang, Bo Ma*, Jianbing Shen, Hui He, Ling Shao, Fatih Porikli

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)

Abstract

In this paper, we present a novel part-based visual tracking method from the perspective of probability sampling. Specifically, we represent the target by a part space with two online learned probabilities that capture its structure. The proposal distribution memorizes the historical performance of different parts and is used for the first round of part selection. The acceptance probability evaluates the tracking stability of each part in the current frame and determines whether its vote is accepted or rejected. In this way, the complex online part selection problem is transformed into a probability learning one, which is easier to tackle. The observation model of each part is constructed by an improved supervised descent method and is learned in an incremental manner. Experimental results on two benchmarks demonstrate the competitive performance of our tracker against state-of-the-art methods.
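To make the two-probability sampling scheme described above concrete, the following is a minimal, illustrative sketch, not the authors' implementation. The names (`parts`, `proposal_probs`, `acceptance_probs`, `vote`) are hypothetical, and the per-part observation model (an improved supervised descent method in the paper) is abstracted away behind a `vote` call.

```python
import numpy as np

def select_and_vote(parts, proposal_probs, acceptance_probs, frame,
                    num_samples=10, rng=None):
    """Illustrative two-round part selection (hypothetical interface).

    parts            : list of part trackers, each exposing vote(frame) -> (dx, dy)
    proposal_probs   : per-part probabilities summarizing historical reliability
                       (must sum to 1)
    acceptance_probs : per-part probabilities that a part's vote is stable
                       in the current frame
    Returns the mean displacement of the accepted votes, or None if all
    sampled votes were rejected.
    """
    rng = rng or np.random.default_rng()

    # First round: draw candidate parts according to the proposal distribution.
    idx = rng.choice(len(parts), size=num_samples, replace=True, p=proposal_probs)

    accepted_votes = []
    for i in idx:
        # Second round: accept this part's vote with its acceptance probability.
        if rng.random() < acceptance_probs[i]:
            accepted_votes.append(parts[i].vote(frame))

    if not accepted_votes:
        return None
    return np.mean(accepted_votes, axis=0)
```

In this reading, the proposal distribution encodes long-term reliability while the acceptance probability acts as a per-frame gate on each sampled part's vote; how the two probabilities are learned online is specified in the paper itself.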

Original language: English
Article number: 8016595
Pages (from-to): 5800-5810
Number of pages: 11
Journal: IEEE Transactions on Image Processing
Volume: 26
Issue number: 12
DOIs
Publication status: Published - Dec 2017

Keywords

  • Part space
  • Visual tracking
  • Sampling

