Brain-Controlled Hand Exoskeleton Based on Augmented Reality-Fused Stimulus Paradigm

Siyu Liu, Mengzhen Liu, Deyu Zhang, Zhiyuan Ming, Ziyu Liu, Qiming Chen, Lingfei Ma, Jiawei Luo, Jian Zhang, Dingjie Suo, Guangying Pei, Tianyi Yan

Research output: Contribution to journal › Article › peer-review

Abstract

Advancements in brain-machine interfaces (BMIs) have led to the development of novel rehabilitation training methods for people with impaired hand function. However, contemporary hand exoskeleton systems predominantly adopt passive control methods, leading to low system performance. In this work, an active brain-controlled hand exoskeleton system is proposed that uses a novel augmented reality-fused stimulus (AR-FS) paradigm as a human-machine interface, enabling users to actively control their finger movements. Because the proposed AR-FS paradigm generates movement artifacts during hand movements, an enhanced decoding algorithm is designed to improve the decoding accuracy and robustness of the system. In online experiments, participants performed online control tasks using the proposed system, with an average task time cost of 16.27 s, an average output latency of 1.54 s, and an average correlation instantaneous rate (CIR) of 0.0321. The proposed system shows 35.37% better efficiency, 8.03% lower system delay, and 35.28% better stability than the traditional system. This study not only provides an efficient rehabilitation solution for people with impaired hand function but also expands the application prospects of brain-control technology in areas such as human augmentation, patient monitoring, and remote robotic interaction. The Graphical Abstract video demonstrates the user's process of operating the proposed brain-controlled hand exoskeleton system.

Original language: English
Pages (from-to): 1-14
Number of pages: 14
Journal: IEEE Journal of Biomedical and Health Informatics
DOI
Publication status: Accepted/In press - 2024
