TY - GEN
T1 - Static Hand Gesture Recognition for Human Robot Interaction
AU - Uwineza, Josiane
AU - Ma, Hongbin
AU - Li, Baokui
AU - Jin, Ying
N1 - Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
PY - 2019
Y1 - 2019
N2 - Human-robot interaction means making a robot understand human actions and work in, and share, the same space as humans. To achieve this, the communication between human and robot must be effective. The most common channels for this communication are voice and body gestures, such as full-body actions, hand and arm gestures, or head and facial gestures. Hand gestures are a natural and effective way of communicating between human and robot. Differences in hand size and posture, lighting variation, and background complexity make hand gesture recognition a challenging problem. Different algorithms have been proposed and have produced good results; yet problems such as poor computational scalability, manual human intervention, and slow learning speed still appear in this field. In this paper, these issues are addressed using a combination of three feature extraction methods, namely Haralick texture, Hu moments, and color histogram, with an extreme learning machine (ELM) for classification. The ELM results were compared to those of K-Nearest Neighbors, Random Forest Classifier, Linear Discriminant Analysis, and Convolutional Neural Networks, and the experiments were evaluated on the National University of Singapore (NUS) dataset II. ELM performed better than all of the above algorithms, with an accuracy of 98.7% and a runtime of 109.7 s, which demonstrates the satisfactory performance of the model.
AB - Human-robot interaction means making a robot understand human actions and work in, and share, the same space as humans. To achieve this, the communication between human and robot must be effective. The most common channels for this communication are voice and body gestures, such as full-body actions, hand and arm gestures, or head and facial gestures. Hand gestures are a natural and effective way of communicating between human and robot. Differences in hand size and posture, lighting variation, and background complexity make hand gesture recognition a challenging problem. Different algorithms have been proposed and have produced good results; yet problems such as poor computational scalability, manual human intervention, and slow learning speed still appear in this field. In this paper, these issues are addressed using a combination of three feature extraction methods, namely Haralick texture, Hu moments, and color histogram, with an extreme learning machine (ELM) for classification. The ELM results were compared to those of K-Nearest Neighbors, Random Forest Classifier, Linear Discriminant Analysis, and Convolutional Neural Networks, and the experiments were evaluated on the National University of Singapore (NUS) dataset II. ELM performed better than all of the above algorithms, with an accuracy of 98.7% and a runtime of 109.7 s, which demonstrates the satisfactory performance of the model.
KW - Color histogram
KW - Extreme Learning Machine
KW - Haralick texture
KW - Hu moments
KW - Human robot interaction
UR - http://www.scopus.com/inward/record.url?scp=85070600964&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-27532-7_37
DO - 10.1007/978-3-030-27532-7_37
M3 - Conference contribution
AN - SCOPUS:85070600964
SN - 9783030275310
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 417
EP - 430
BT - Intelligent Robotics and Applications - 12th International Conference, ICIRA 2019, Proceedings
A2 - Yu, Haibin
A2 - Liu, Jinguo
A2 - Liu, Lianqing
A2 - Liu, Yuwang
A2 - Ju, Zhaojie
A2 - Zhou, Dalin
PB - Springer-Verlag
T2 - 12th International Conference on Intelligent Robotics and Applications, ICIRA 2019
Y2 - 8 August 2019 through 11 August 2019
ER -