TY - JOUR
T1 - Brain-Controlled Hand Exoskeleton Based on Augmented Reality-Fused Stimulus Paradigm
AU - Liu, Siyu
AU - Liu, Mengzhen
AU - Zhang, Deyu
AU - Ming, Zhiyuan
AU - Liu, Ziyu
AU - Chen, Qiming
AU - Ma, Lingfei
AU - Luo, Jiawei
AU - Zhang, Jian
AU - Suo, Dingjie
AU - Pei, Guangying
AU - Yan, Tianyi
N1 - Publisher Copyright:
IEEE
PY - 2024
Y1 - 2024
N2 - Advancements in brain-machine interfaces (BMIs) have led to the development of novel rehabilitation training methods for people with impaired hand function. However, contemporary hand exoskeleton systems predominantly adopt passive control methods, resulting in low system performance. In this work, an active brain-controlled hand exoskeleton system is proposed that uses a novel augmented reality-fused stimulus (AR-FS) paradigm as a human-machine interface, enabling users to actively control their finger movements. Because the proposed AR-FS paradigm generates movement artifacts during hand movements, an enhanced decoding algorithm is designed to improve the decoding accuracy and robustness of the system. In online experiments, participants performed control tasks using the proposed system, with an average task time of 16.27 s, an average output latency of 1.54 s, and an average correlation instantaneous rate (CIR) of 0.0321. The proposed system shows 35.37% higher efficiency, 8.03% lower system delay, and 35.28% better stability than the traditional system. This study not only provides an efficient rehabilitation solution for people with impaired hand function but also expands the application prospects of brain-control technology in areas such as human augmentation, patient monitoring, and remote robotic interaction. The Graphical Abstract video demonstrates a user operating the proposed brain-controlled hand exoskeleton system.
AB - Advancements in brain-machine interfaces (BMIs) have led to the development of novel rehabilitation training methods for people with impaired hand function. However, contemporary hand exoskeleton systems predominantly adopt passive control methods, resulting in low system performance. In this work, an active brain-controlled hand exoskeleton system is proposed that uses a novel augmented reality-fused stimulus (AR-FS) paradigm as a human-machine interface, enabling users to actively control their finger movements. Because the proposed AR-FS paradigm generates movement artifacts during hand movements, an enhanced decoding algorithm is designed to improve the decoding accuracy and robustness of the system. In online experiments, participants performed control tasks using the proposed system, with an average task time of 16.27 s, an average output latency of 1.54 s, and an average correlation instantaneous rate (CIR) of 0.0321. The proposed system shows 35.37% higher efficiency, 8.03% lower system delay, and 35.28% better stability than the traditional system. This study not only provides an efficient rehabilitation solution for people with impaired hand function but also expands the application prospects of brain-control technology in areas such as human augmentation, patient monitoring, and remote robotic interaction. The Graphical Abstract video demonstrates a user operating the proposed brain-controlled hand exoskeleton system.
KW - Augmented Reality-Fused Stimulus (AR-FS) Paradigm
KW - Brain-Controlled Hand Exoskeleton
KW - Brain-Machine Interfaces (BMIs)
KW - Machine Vision (MV)
UR - http://www.scopus.com/inward/record.url?scp=85194867932&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2024.3406684
DO - 10.1109/JBHI.2024.3406684
M3 - Article
C2 - 38809723
AN - SCOPUS:85194867932
SN - 2168-2194
SP - 1
EP - 14
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
ER -