Arm gesture analysis based on pose tracking and hidden conditional random fields

Fawang Liu*, Gangyi Ding, Yihua Xu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We present a model-based probabilistic framework for arm gesture analysis. The methodology trades precision against simplicity, extracting 3D upper-body pose by fusing particle filtering with model constraints. Existing recognition approaches typically use generative structures such as Hidden Markov Models, but generative models often have to make unrealistic conditional-independence assumptions and cannot accommodate long-term contextual dependencies. Moreover, generative models usually require a considerable number of observations for certain gesture classes and may not uncover the distinctive configuration that sets one gesture class apart from the others. In our framework, we instead employ Hidden Conditional Random Fields to model and classify gestures in a discriminative formulation. Experimental results show that the proposed framework tracks motion robustly and recognizes arm activities accurately under temporal, intra-, and inter-person variations.
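The record does not include implementation details. As a rough illustration of the discriminative formulation mentioned in the abstract, the sketch below scores a gesture sequence with a chain-structured Hidden Conditional Random Field: each class is scored by summing (in log space) over all hidden-state sequences, and the class posterior is a softmax over those scores. The function names, weight shapes, and feature dimensions are assumptions introduced for illustration, not the authors' implementation.

    import numpy as np

    def hcrf_class_scores(obs, w_obs, w_trans, w_label):
        """Forward pass of a chain-structured HCRF (illustrative sketch).

        obs     : (T, D) per-frame features, e.g. tracked joint angles.
        w_obs   : (H, D) hidden-state / observation weights.
        w_trans : (C, H, H) per-class hidden-state transition weights.
        w_label : (C, H) hidden-state / class-label weights.
        Returns a length-C vector of unnormalised log scores,
        log sum_h exp(Phi(y, h, x)), one per gesture class y.
        """
        T, _ = obs.shape
        C, H, _ = w_trans.shape
        emit = obs @ w_obs.T                      # (T, H) state-observation potentials
        scores = np.empty(C)
        for y in range(C):
            alpha = emit[0] + w_label[y]          # log-potentials at t = 0
            for t in range(1, T):
                # log-sum-exp over the previous hidden state
                m = alpha[:, None] + w_trans[y]   # rows: previous state, cols: current
                alpha = emit[t] + w_label[y] + \
                    np.log(np.exp(m - m.max(0)).sum(0)) + m.max(0)
            scores[y] = np.log(np.exp(alpha - alpha.max()).sum()) + alpha.max()
        return scores

    def hcrf_posterior(obs, w_obs, w_trans, w_label):
        """p(y | x): softmax over the per-class log scores."""
        s = hcrf_class_scores(obs, w_obs, w_trans, w_label)
        s -= s.max()
        p = np.exp(s)
        return p / p.sum()

    # Toy usage with made-up dimensions: 3 gesture classes, 4 hidden states,
    # 6-D frame features over a 20-frame sequence.
    rng = np.random.default_rng(0)
    probs = hcrf_posterior(rng.normal(size=(20, 6)),
                           rng.normal(size=(4, 6)),
                           rng.normal(size=(3, 4, 4)),
                           rng.normal(size=(3, 4)))

In practice the weights would be learned by maximising the conditional log-likelihood of labelled gesture sequences; the point of the sketch is only how a discriminative model with hidden states scores a whole sequence per class, in contrast to a generative HMM.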

Original language: English
Pages (from-to): 605-612
Number of pages: 8
Journal: Journal of Information and Computational Science
Volume: 5
Issue number: 2
Publication status: Published - Mar 2008

Keywords

  • Gesture analysis
  • Hidden conditional random fields
  • Pose tracking

