Visual hand tracking using nonparametric sequential belief propagation

Wei Liang*, Yunde Jia, Cheng Ge

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Hand tracking is a challenging problem due to the complexity of searching a space with more than 20 degrees of freedom for an optimal estimate. This paper develops a statistical method for robust visual hand tracking, in which a graphical model that decouples the hand joints is used to represent the hand constraints. Each node of the graphical model represents the position and orientation of one hand joint in world coordinates. The hand tracking problem is thereby transformed into inference on the graphical model. We extend Nonparametric Belief Propagation to a sequential process to track hand motion. Experimental results show that this approach is robust for 3D hand motion tracking.
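The abstract describes particle-based message passing over a graph of hand joints, carried forward frame by frame. The following is a minimal sketch (not the authors' code) of that idea under simplifying assumptions: joint states are 3-D positions only (the paper also models orientation), the graph is a simple chain of joints, the pairwise constraint is a Gaussian penalty on bone length, and the image likelihood is a placeholder; all names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 100
BONE_LENGTH = 1.0          # hypothetical rest length between adjacent joints
CONSTRAINT_SIGMA = 0.05    # tolerance on the bone-length constraint
MOTION_SIGMA = 0.02        # temporal diffusion between frames


def likelihood(samples, observation):
    """Placeholder image likelihood: Gaussian around an observed joint position."""
    d2 = np.sum((samples - observation) ** 2, axis=1)
    return np.exp(-0.5 * d2 / 0.1 ** 2)


def pairwise_weight(samples, neighbor_particles):
    """Soft bone-length constraint against a neighboring joint's particle cloud."""
    # Compare each sample to the neighbor cloud's mean (a crude stand-in for a
    # full kernel-density message, to keep the sketch short).
    mean_neighbor = neighbor_particles.mean(axis=0)
    d = np.linalg.norm(samples - mean_neighbor, axis=1)
    return np.exp(-0.5 * ((d - BONE_LENGTH) / CONSTRAINT_SIGMA) ** 2)


def bp_iteration(beliefs, observations):
    """One round of particle-based message passing on a chain of joints."""
    new_beliefs = []
    for j, particles in enumerate(beliefs):
        # Diffuse particles (kernel smoothing of the nonparametric belief).
        samples = particles + rng.normal(0, MOTION_SIGMA, particles.shape)
        w = likelihood(samples, observations[j])
        # Multiply in messages from chain neighbors via the pairwise constraint.
        if j > 0:
            w *= pairwise_weight(samples, beliefs[j - 1])
        if j < len(beliefs) - 1:
            w *= pairwise_weight(samples, beliefs[j + 1])
        w = (w + 1e-12) / (w + 1e-12).sum()
        # Resample to keep an equally weighted particle representation.
        idx = rng.choice(len(samples), size=N_PARTICLES, p=w)
        new_beliefs.append(samples[idx])
    return new_beliefs


def track(frames, init_beliefs, bp_rounds=3):
    """Sequential extension: the previous frame's beliefs seed the next frame."""
    beliefs = init_beliefs
    trajectory = []
    for observations in frames:
        for _ in range(bp_rounds):
            beliefs = bp_iteration(beliefs, observations)
        trajectory.append([b.mean(axis=0) for b in beliefs])
    return trajectory


if __name__ == "__main__":
    # Synthetic example: a 3-joint chain drifting along the x axis.
    n_joints, n_frames = 3, 5
    frames = [np.array([[t * 0.1 + j * BONE_LENGTH, 0.0, 0.0]
                        for j in range(n_joints)]) for t in range(n_frames)]
    init = [frames[0][j] + rng.normal(0, 0.1, (N_PARTICLES, 3))
            for j in range(n_joints)]
    for t, est in enumerate(track(frames, init)):
        print(f"frame {t}: " +
              ", ".join(f"({p[0]:.2f},{p[1]:.2f},{p[2]:.2f})" for p in est))
```

The resampling step keeps each joint's belief as an equally weighted particle set, so the previous frame's output can directly serve as the next frame's prior, which is the essence of making belief propagation sequential.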

Original language: English
Pages (from-to): 679-687
Number of pages: 9
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3644
Issue number: PART I
DOI
Publication status: Published - 2005
Event: International Conference on Intelligent Computing, ICIC 2005 - Hefei, China
Duration: 23 Aug 2005 - 26 Aug 2005
