Boosting-based visual tracking using structural local sparse descriptors

Yangbiao Liu, Bo Ma*, Hongwei Hu, Yin Han

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper develops an online algorithm based on sparse representation and boosting for robust object tracking. Local descriptors of a target object are formed by pooling the sparse codes of its local patches, and an AdaBoost classifier is learned from these local descriptors to discriminate the target from the background. Meanwhile, the proposed algorithm assigns a weight, computed with a generative model, to each candidate object to adjust the classification result. In addition, a template update strategy, based on incremental principal component analysis and an occlusion handling scheme, is presented to capture appearance changes of the target and to alleviate the visual drift problem. Comparison with state-of-the-art trackers on a comprehensive benchmark shows the effectiveness of the proposed method.
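The following is a minimal sketch, not the authors' code, of the candidate-scoring step outlined in the abstract: local patches of each candidate are sparse-coded over a dictionary, the codes are pooled into a structural local descriptor, an AdaBoost classifier supplies a discriminative score, and a generative weight modulates that score. The patch size, step, average pooling, and the exponential reconstruction-error form of the generative weight are assumptions made for illustration.

```python
import numpy as np
from sklearn.decomposition import sparse_encode
from sklearn.ensemble import AdaBoostClassifier  # clf below is assumed to be a fitted instance


def local_descriptor(candidate, dictionary, patch=8, step=4):
    """Pool sparse codes of overlapping local patches into one structural descriptor.

    candidate: 2-D grayscale image region; dictionary: (n_atoms, patch*patch) array.
    """
    h, w = candidate.shape
    codes = []
    for y in range(0, h - patch + 1, step):
        for x in range(0, w - patch + 1, step):
            p = candidate[y:y + patch, x:x + patch].reshape(1, -1)
            codes.append(sparse_encode(p, dictionary,
                                       algorithm='lasso_lars', alpha=0.01)[0])
    # average pooling over the patch grid; other pooling schemes are possible
    return np.mean(codes, axis=0)


def score_candidates(candidates, dictionary, clf, templates):
    """Combine the AdaBoost confidence with a generative weight for each candidate.

    templates: (n_templates, h*w) array of vectorized target templates.
    Returns the index of the best-scoring candidate.
    """
    scores = []
    for c in candidates:
        d = local_descriptor(c, dictionary)
        disc = clf.decision_function(d[None, :])[0]           # discriminative score
        err = np.min(np.linalg.norm(templates - c.ravel(), axis=1))
        w_gen = np.exp(-err)                                   # generative weight (assumed form)
        scores.append(w_gen * disc)
    return int(np.argmax(scores))
```

In this sketch the classifier would be trained each frame on descriptors of positive (target) and negative (background) samples, and the dictionary and templates would be refreshed by the incremental PCA update mentioned in the abstract; those steps are omitted here.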

Original language: English
Title of host publication: Computer Vision - ACCV 2014 - 12th Asian Conference on Computer Vision, Revised Selected Papers
Editors: Daniel Cremers, Hideo Saito, Ian Reid, Ming-Hsuan Yang
Publisher: Springer Verlag
Pages: 522-533
Number of pages: 12
ISBN (Electronic): 9783319168135
DOIs
Publication status: Published - 2015
Event: 12th Asian Conference on Computer Vision, ACCV 2014 - Singapore, Singapore
Duration: 1 Nov 2014 - 5 Nov 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9007
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 12th Asian Conference on Computer Vision, ACCV 2014
Country/Territory: Singapore
City: Singapore
Period: 1/11/14 - 5/11/14
