Fast hand posture classification using depth features extracted from random line segments

Weizhi Nai, Yue Liu*, David Rempel, Yongtian Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

43 Citations (Scopus)

Abstract

In this paper we propose a set of fast-computable depth features for static hand posture classification from a single depth image. The proposed features, extracted from pixels on randomly positioned line segments, are designed as low-level cues for a random forest classifier, which combines the cues to discover high-level, previously unseen informative structure in an infinite-dimensional feature space. While simple, the proposed features effectively capture both hand shape geometry and depth texture information. The accuracy and speed of the recognition algorithm based on the proposed features are evaluated on the American Sign Language (ASL) finger spelling dataset and on two new hand posture datasets. The proposed algorithm achieves recognition accuracy comparable to state-of-the-art methods while being much faster in both the training and testing phases. Our implementation runs at about 600 fps using only one thread of an i7 CPU. A pre-trained demo program is available to the public.
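To make the idea concrete, the following is a minimal, hypothetical Python sketch of how depth values sampled along randomly positioned line segments could serve as features for a random forest. The segment count, sampling density, image size, placeholder data, and the use of scikit-learn's RandomForestClassifier are all illustrative assumptions, not the authors' implementation (which trains the forest directly on such cues and is far faster).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch: the exact feature definition (segment sampling scheme,
# sampling density, normalization) is an assumption, not the paper's code.

def sample_segments(num_segments, height, width, rng):
    """Randomly position line segments inside the image, as (x0, y0, x1, y1)."""
    xs = rng.integers(0, width, size=(num_segments, 2))
    ys = rng.integers(0, height, size=(num_segments, 2))
    return np.stack([xs[:, 0], ys[:, 0], xs[:, 1], ys[:, 1]], axis=1)

def segment_depth_features(depth, segments, samples_per_segment=8):
    """Read depth values at evenly spaced pixels along each segment."""
    t = np.linspace(0.0, 1.0, samples_per_segment)
    feats = []
    for x0, y0, x1, y1 in segments:
        xs = np.round(x0 + t * (x1 - x0)).astype(int)
        ys = np.round(y0 + t * (y1 - y0)).astype(int)
        feats.append(depth[ys, xs])
    return np.concatenate(feats)

# Usage with placeholder data (e.g. 96x96 cropped hand depth patches).
rng = np.random.default_rng(0)
segments = sample_segments(num_segments=64, height=96, width=96, rng=rng)

train_depth = rng.random((200, 96, 96)).astype(np.float32)  # placeholder depth images
train_labels = rng.integers(0, 24, size=200)                # placeholder posture labels

X = np.array([segment_depth_features(d, segments) for d in train_depth])
clf = RandomForestClassifier(n_estimators=100).fit(X, train_labels)
```

This sketch precomputes a fixed feature vector per image for simplicity; in the paper the random forest itself selects among the segment-based cues during training, which is part of what makes the method fast at test time.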

Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: Pattern Recognition
Volume: 65
DOIs
Publication status: Published - 1 May 2017

Keywords

  • Depth feature
  • Hand posture
  • Random forest
