TY - GEN
T1 - Online gesture recognition algorithm applied to HUD based smart driving system
AU - Wang, Jingyao
AU - Chen, Jing
AU - Qiao, Yuanyuan
AU - Zhou, Junyan
AU - Wang, Yongtian
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/10
Y1 - 2019/10
N2 - To avoid the driver distraction caused by traditional buttons or touch screens, gesture recognition technology has begun to be applied to human-car interaction. An online gesture recognition system designed for vehicles must be powerful enough to satisfy the requirements of high classification accuracy, fast response time, and low graphics memory consumption. To meet these challenges, we propose an online gesture recognition algorithm based on an RGB camera that identifies motion, hands, and gestures in sequence. We use the frame difference as a motion detection modality and apply a hand-detection neural network to determine whether to activate the gesture classifier. In the gesture classifier, the frame difference is fused with the RGB image at the data level based on the Efficient Convolutional Network. We combined gesture recognition with a Heads-Up Display to create a simulated driving system that allows users to control auxiliary information through gestures, and used it for usability analysis and user evaluation. To find the gestures that best match the various interactive tasks, we use the entropy weight method to analyze the usability of the gestures in the JESTER dataset and derive the seven best gestures. The offline gesture classification accuracy on the JESTER dataset is 95.96%, and the online recognition algorithm runs on average at 306 fps when there is no motion and 164 fps in the presence of a hand. According to questionnaire results collected after subjects used our system, more than 86.25% of the subjects expressed satisfaction with our gesture recognition system.
AB - To avoid the driver distraction caused by traditional buttons or touch screens, gesture recognition technology has begun to be applied to human-car interaction. An online gesture recognition system designed for vehicles must be powerful enough to satisfy the requirements of high classification accuracy, fast response time, and low graphics memory consumption. To meet these challenges, we propose an online gesture recognition algorithm based on an RGB camera that identifies motion, hands, and gestures in sequence. We use the frame difference as a motion detection modality and apply a hand-detection neural network to determine whether to activate the gesture classifier. In the gesture classifier, the frame difference is fused with the RGB image at the data level based on the Efficient Convolutional Network. We combined gesture recognition with a Heads-Up Display to create a simulated driving system that allows users to control auxiliary information through gestures, and used it for usability analysis and user evaluation. To find the gestures that best match the various interactive tasks, we use the entropy weight method to analyze the usability of the gestures in the JESTER dataset and derive the seven best gestures. The offline gesture classification accuracy on the JESTER dataset is 95.96%, and the online recognition algorithm runs on average at 306 fps when there is no motion and 164 fps in the presence of a hand. According to questionnaire results collected after subjects used our system, more than 86.25% of the subjects expressed satisfaction with our gesture recognition system.
KW - Frame Difference
KW - Heads-Up Display
KW - Human Car Interaction
KW - Neural Networks
KW - Online Gesture Recognition
UR - http://www.scopus.com/inward/record.url?scp=85078729145&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct.2019.00-26
DO - 10.1109/ISMAR-Adjunct.2019.00-26
M3 - Conference contribution
AN - SCOPUS:85078729145
T3 - Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
SP - 289
EP - 294
BT - Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Y2 - 14 October 2019 through 18 October 2019
ER -