Machine Learning-Assisted Gesture Sensor Made with Graphene/Carbon Nanotubes for Sign Language Recognition

Hao Yuan Shen, Yu Tao Li*, Hang Liu, Jie Lin, Lu Yu Zhao, Guo Peng Li, Yi Wen Wu, Tian Ling Ren*, Yeliang Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Gesture sensors are essential for collecting human movements in human-computer interfaces, but their application is usually hampered by the difficulty of achieving high sensitivity and an ultrawide response range simultaneously. In this article, inspired by the structure of spider silk in nature, a novel gesture sensor with a core-shell structure is proposed. The sensor offers a high gauge factor of up to 340 and a wide response range of 60%. Moreover, combining the sensor with a deep learning technique creates a system for precise gesture recognition. The system demonstrates an impressive 99% accuracy in single-gesture recognition tests. Meanwhile, by using a sliding-window technique and a large language model, a high accuracy of 97% is achieved in continuous sentence recognition. In summary, the proposed high-performance sensor significantly improves the sensitivity and response range of gesture recognition sensors, and its combination with neural network technology further improves daily communication for sign language users.
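
The abstract reports a gauge factor of about 340 (defined as GF = (ΔR/R0)/ε) and a sliding-window pipeline for continuous recognition, but gives no implementation details. The sketch below is only a rough illustration of those two ideas in plain Python with NumPy: the window size, stride, baseline resistance, and synthetic signal are our assumptions, not the authors' setup, and the per-window classifier is left as a placeholder.

import numpy as np

# GF = 340 comes from the abstract; all other parameters are illustrative assumptions.
GAUGE_FACTOR = 340.0
WINDOW_SIZE = 100   # samples per window (assumed)
STRIDE = 25         # hop between consecutive windows (assumed)

def strain_from_resistance(r, r0, gauge_factor=GAUGE_FACTOR):
    """Estimate strain from relative resistance change, using GF = (dR/R0) / strain."""
    return (r - r0) / (r0 * gauge_factor)

def sliding_windows(signal, window=WINDOW_SIZE, stride=STRIDE):
    """Yield overlapping windows of a 1-D sensor stream for per-window classification."""
    for start in range(0, len(signal) - window + 1, stride):
        yield signal[start:start + window]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r0 = 1000.0                                  # baseline resistance in ohms (assumed)
    stream = r0 * (1 + 0.1 * rng.random(500))    # synthetic resistance readings
    for i, win in enumerate(sliding_windows(stream)):
        strain = strain_from_resistance(win, r0)
        # A trained gesture classifier would be applied to each window here.
        print(f"window {i}: mean strain = {strain.mean():.2e}")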

Original language: English
Pages (from-to): 52911-52920
Number of pages: 10
Journal: ACS Applied Materials and Interfaces
Volume: 16
Issue number: 39
DOIs
Publication status: Published - 2 Oct 2024

Keywords

  • gesture recognition
  • gesture sensor
  • high sensitivity
  • machine learning
  • wide strain range
