Abstract
Hand tracking is a challenging problem due to the complexity of searching a space of more than 20 degrees of freedom for an optimal estimate. This paper develops a statistical method for robust visual hand tracking, in which a graphical model that decouples the different hand joints is used to represent the hand constraints. Each node of the graphical model represents the position and orientation of a hand joint in world coordinates. The problem of hand tracking is thereby transformed into inference on the graphical model. We extend Nonparametric Belief Propagation into a sequential process to track hand motion. Experimental results show that this approach is robust for 3D hand motion tracking.
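The abstract describes tracking as inference on a joint-level graphical model via a sequential extension of Nonparametric Belief Propagation. The sketch below is a minimal, illustrative toy of that general idea only: particle-based messages between two connected joints and a simple temporal prediction step. All potentials, parameters, and function names are assumptions for illustration and do not reflect the authors' actual implementation.

```python
# Illustrative sketch: toy particle-based ("nonparametric") belief propagation
# on a two-node chain, plus a temporal prediction step. Potentials, noise
# levels, and state layout are assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
N = 200          # particles per node
STATE_DIM = 6    # assumed: 3D position + 3 orientation angles per joint

def pairwise_potential(x_from, x_to, sigma=0.1):
    """Assumed Gaussian compatibility between two connected joints."""
    d = np.linalg.norm(x_from - x_to, axis=-1)
    return np.exp(-0.5 * (d / sigma) ** 2)

def likelihood(x, observation, sigma=0.2):
    """Assumed image likelihood: Gaussian around an observed joint state."""
    d = np.linalg.norm(x - observation, axis=-1)
    return np.exp(-0.5 * (d / sigma) ** 2)

def send_message(particles_from, weights_from, particles_to):
    """Approximate the BP message from one joint to its neighbour:
    m(x_to) ~ sum_i w_i * psi(x_i, x_to), evaluated at the target's particles."""
    m = np.zeros(len(particles_to))
    for xi, wi in zip(particles_from, weights_from):
        m += wi * pairwise_potential(xi, particles_to)
    return m / (m.sum() + 1e-12)

def temporal_predict(particles, weights, noise=0.05):
    """Sequential step: resample the posterior particles and diffuse them
    with a simple random-walk dynamics model (an assumption)."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx] + noise * rng.standard_normal(particles[idx].shape)

# Two connected joints (e.g. a finger base and its neighbour), toy observations.
particles_a = rng.standard_normal((N, STATE_DIM))
particles_b = rng.standard_normal((N, STATE_DIM))
obs_a = np.zeros(STATE_DIM)
obs_b = 0.3 * np.ones(STATE_DIM)

w_a = likelihood(particles_a, obs_a)
w_a /= w_a.sum()

# Belief at joint B: local likelihood times the incoming message from joint A.
msg_a_to_b = send_message(particles_a, w_a, particles_b)
w_b = likelihood(particles_b, obs_b) * msg_a_to_b
w_b /= w_b.sum()

# Propagate joint B's belief to the next frame.
particles_b_next = temporal_predict(particles_b, w_b)
print("posterior mean of joint B:", (w_b[:, None] * particles_b).sum(axis=0))
```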
Original language | English |
---|---|
Pages (from-to) | 679-687 |
Number of pages | 9 |
Journal | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
Volume | 3644 |
Issue number | PART I |
DOIs | |
Publication status | Published - 2005 |
Event | International Conference on Intelligent Computing, ICIC 2005, Hefei, China, 23 Aug 2005 → 26 Aug 2005 |