Machine-vision fused brain machine interface based on dynamic augmented reality visual stimulation

Deyu Zhang, Siyu Liu, Kai Wang, Jian Zhang*, Duanduan Chen, Yilong Zhang, Li Nie, Jiajia Yang, Funabashi Shinntarou, Jinglong Wu, Tianyi Yan*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Objective. Brain-machine interfaces (BMIs) translate human intent into machine actions, and the visual stimulation (VS) paradigm is one of the most widely used approaches. Although VS-based BMIs achieve a relatively high information transfer rate (ITR), it remains difficult for BMIs to control machines in dynamic environments (for example, grasping a moving object or tracking a walking person). Approach. In this study, we propose a BMI based on augmented reality (AR) VS (AR-VS). The VS was generated dynamically from machine vision, and human intent was decoded with a dynamic decision time interval approach. A robot coordinating a task system and a self-motion system was controlled quickly and flexibly by the proposed paradigm. Methods. Objects in the scene were first recognized by machine vision and tracked by optical flow. The AR-VS was generated from the objects' parameters, with the number and placement of stimuli determined by the recognized objects. Electroencephalogram (EEG) features corresponding to the VS and human intent were recorded with a dry-electrode EEG cap and classified by the filter bank canonical correlation analysis (FBCCA) method. Key parameters of the AR-VS, including the effects of VS size, frequency and moving-object speed, as well as the ITR and the performance of the BMI-controlled robot, were analyzed. Conclusion and significance. The ITR of the proposed AR-VS paradigm for nine healthy subjects was 36.3 ± 20.1 bits min⁻¹. In the online robot control experiment, brain-controlled hybrid tasks combining self-motion and object grasping were completed 64% faster than with the traditional steady-state visual evoked potential (SSVEP) paradigm. The proposed AR-VS paradigm could be optimized and adopted in other VS-based BMIs, such as the P300, omitted stimulus potential, and miniature event-related potential paradigms, for better performance in dynamic environments.
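The abstract names filter bank canonical correlation analysis (FBCCA) as the method used to match EEG features to the flickering AR stimuli. The sketch below illustrates the general FBCCA scoring step in Python; it is not the authors' implementation, and the sampling rate, sub-band edges, harmonic count, and weighting constants are illustrative assumptions.

```python
# Minimal sketch of FBCCA-based SSVEP frequency detection.
# All parameter values below (sampling rate, sub-bands, harmonics, weights)
# are assumptions for illustration, not the settings used in the paper.
import numpy as np
from scipy.signal import butter, filtfilt
from numpy.linalg import svd

FS = 250            # assumed EEG sampling rate (Hz)
N_HARMONICS = 4     # harmonics used in the sinusoidal reference templates
SUBBANDS = [(8 * k, 90) for k in range(1, 6)]   # assumed filter-bank pass bands (Hz)
A, B = 1.25, 0.25   # sub-band weights w(k) = k**-A + B

def reference_signals(freq, n_samples):
    """Sine/cosine templates at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / FS
    refs = []
    for h in range(1, N_HARMONICS + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.asarray(refs)                      # shape: (2*N_HARMONICS, n_samples)

def cca_max_corr(X, Y):
    """Largest canonical correlation between the row spaces of X and Y."""
    X = X - X.mean(axis=1, keepdims=True)
    Y = Y - Y.mean(axis=1, keepdims=True)
    Qx, _ = np.linalg.qr(X.T)
    Qy, _ = np.linalg.qr(Y.T)
    return svd(Qx.T @ Qy, compute_uv=False)[0]

def fbcca_score(eeg, freq):
    """Weighted sum of sub-band canonical correlations for one candidate frequency."""
    refs = reference_signals(freq, eeg.shape[1])
    score = 0.0
    for k, (lo, hi) in enumerate(SUBBANDS, start=1):
        b_coef, a_coef = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        sub = filtfilt(b_coef, a_coef, eeg, axis=1)   # k-th sub-band of the epoch
        rho = cca_max_corr(sub, refs)
        score += (k ** -A + B) * rho ** 2
    return score

def classify(eeg, candidate_freqs):
    """Return the stimulation frequency whose templates best match the EEG epoch."""
    scores = [fbcca_score(eeg, f) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

# Example: an 8-channel, 2-second epoch scored against four flicker frequencies.
epoch = np.random.randn(8, 2 * FS)
print(classify(epoch, [8.0, 10.0, 12.0, 15.0]))
```

In a paradigm like the one described, the frequency returned by `classify` would index the machine-vision-detected object whose overlaid AR stimulus flickers at that rate.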

Original language: English
Article number: 056061
Journal: Journal of Neural Engineering
Volume: 18
Issue number: 5
DOIs
Publication status: Published - Oct 2021

Keywords

  • augmented reality
  • brain-machine interfaces
  • robot control
