TY - GEN
T1 - Skeleton Based Dynamic Hand Gesture Recognition using Short Term Sampling Neural Networks (STSNN)
AU - Ikram, Aamrah
AU - Liu, Yue
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023.
PY - 2023
Y1 - 2023
AB - This research introduces an innovative framework for real-time dynamic hand gesture recognition in the field of Human-Computer Interaction (HCI). The framework combines deep learning networks with multiple datasets to extract both short-term and long-term features from video input. A significant contribution of this research lies in the integration of Convolutional Neural Networks (CNNs) into a specialized Short Term Sampling Neural Network (STSNN), enabling the capture of long-term contextual information for accurate gesture recognition. The proposed framework is thoroughly evaluated on two hand gesture datasets, namely the 14/28 dataset and the LDMI database. By leveraging the computational power of deep learning networks and the fusion of diverse datasets, our model outperforms previous methods, establishing its efficacy in real-time dynamic hand gesture recognition tasks. The outcomes of this research contribute to the advancement of HCI, providing a robust and technically sophisticated solution for gesture-based interfaces. The findings hold promise for enhancing user experiences and facilitating seamless integration of gesture-based interaction techniques across various domains, ultimately improving the efficiency and effectiveness of human-computer interactions.
KW - Augmented Reality
KW - Depth Sensor
KW - Dynamic Hand Gesture Recognition
KW - Human-Computer Interaction
UR - http://www.scopus.com/inward/record.url?scp=85177430027&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-46305-1_30
DO - 10.1007/978-3-031-46305-1_30
M3 - Conference contribution
AN - SCOPUS:85177430027
SN - 9783031463044
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 368
EP - 379
BT - Image and Graphics - 12th International Conference, ICIG 2023, Proceedings
A2 - Lu, Huchuan
A2 - Liu, Risheng
A2 - Ouyang, Wanli
A2 - Huang, Hui
A2 - Lu, Jiwen
A2 - Dong, Jing
A2 - Xu, Min
PB - Springer Science and Business Media Deutschland GmbH
T2 - 12th International Conference on Image and Graphics, ICIG 2023
Y2 - 22 September 2023 through 24 September 2023
ER -