Online Gesture Recognition Algorithm Applied to HUD-Based Smart Driving System

Jingyao Wang*, Jing Chen, Yuanyuan Qiao, Junyan Zhou, Yongtian Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Citations (Scopus)

Abstract

To avoid the driver distraction caused by traditional buttons or touch screens, gesture recognition technology has begun to be applied to human-car interaction. An online gesture recognition system designed for vehicles must be powerful enough to satisfy the requirements of high classification accuracy, fast response time, and low graphics memory consumption. To meet these challenges, we propose an online gesture recognition algorithm based on an RGB camera that identifies motion, hand, and gesture in sequence. We use the frame difference as a motion detection modality and apply a hand detection neural network to determine whether to activate the gesture classifier. In the gesture classifier, the frame difference is fused with the RGB image at the data level based on an Efficient Convolutional Network. We combined gesture recognition and a Heads-Up Display to create a simulated driving system that allows users to control auxiliary information through gestures, which we used for usability analysis and user evaluation. To find the gestures that best match the various interactive tasks, we use the entropy weight method to analyze the usability of the gestures in the JESTER dataset and derive the seven best gestures. The offline gesture classification accuracy on the JESTER dataset is 95.96%, and the online recognition algorithm runs on average at 306 fps when there is no motion and 164 fps in the presence of a hand. According to the questionnaire results collected after subjects used our system, more than 86.25% of the subjects expressed satisfaction with our gesture recognition system.
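The cascade described in the abstract can be illustrated with its cheapest stage: a frame-difference check that gates whether the heavier hand detector and gesture classifier run at all. The sketch below is a hypothetical minimal version of that idea, not the authors' implementation; the threshold values and function names are illustrative assumptions.

```python
import numpy as np

def frame_difference(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Absolute per-pixel difference between consecutive grayscale frames."""
    # Cast to a signed type first so the subtraction cannot wrap around.
    return np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16)).astype(np.uint8)

def motion_detected(diff: np.ndarray, pixel_thresh: int = 25,
                    ratio_thresh: float = 0.01) -> bool:
    """Declare motion if enough pixels changed by more than pixel_thresh."""
    changed_ratio = (diff > pixel_thresh).mean()
    return bool(changed_ratio > ratio_thresh)

# Simulated 64x64 grayscale frames: a static background, then a bright patch
# (standing in for a hand) appearing in the scene.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:30, 10:30] = 200

print(motion_detected(frame_difference(prev, curr)))  # True: a region changed
print(motion_detected(frame_difference(prev, prev)))  # False: nothing moved
```

In a pipeline like the one the paper describes, only frames where `motion_detected` fires would be passed on to the hand detection network, which explains the large fps gap reported between the no-motion case (306 fps) and the hand-present case (164 fps).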

Original language: English
Title of host publication: Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 289-294
Number of pages: 6
ISBN (Electronic): 9781728147659
DOIs
Publication status: Published - Oct 2019
Event: 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 - Beijing, China
Duration: 14 Oct 2019 - 18 Oct 2019

Publication series

Name: Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019

Conference

Conference: 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Country/Territory: China
City: Beijing
Period: 14/10/19 - 18/10/19

Keywords

  • Frame Difference
  • Heads-Up Display
  • Human-Car Interaction
  • Neural Networks
  • Online Gesture Recognition

