Grasp What You See: Toward Fine-Grained Brain-Controlled Robotic Arm Manipulation in 3D IoT Scenarios

  • Zhiyuan Ming
  • Yilun Huang
  • Mengzhen Liu
  • Jian Zhang
  • Siyu Liu*
  • Tianyi Yan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Brain-controlled robotic arms (BCRAs) have emerged as promising end-effectors in Internet of Things (IoT) environments, enabling direct neural control of physical interactions in remote and complex scenarios. However, existing BCRA systems face challenges in achieving precise fine-grained control and adapting to diverse manipulation tasks. To address these limitations, this study proposes a novel BCRA system with a Human-Centered Visual Evoked Potential (HC-VEP) paradigm. By leveraging vision-based spatial mapping, the system enables fine-grained, coordinate-level manipulation of the BCRA in complex 3D environments. To further enhance system performance, Foveal Attention Tracking (FAT) is integrated to rapidly estimate the user's intended grasp location, thereby improving interaction efficiency. Additionally, a Time-Frequency Domain Enhanced Network (TFDE-Net) is developed to improve electroencephalogram (EEG) decoding accuracy through advanced time-frequency feature extraction. Experimental results demonstrate the effectiveness of the proposed system. Offline evaluations show that TFDE-Net achieves a peak information transfer rate (ITR) of 111.81 bits/min, a 20.8% improvement over EEGNet. Online experiments in simulated environments demonstrate the efficiency of our paradigm. Specifically, HC-VEP with FAT reduces task completion time by 56.86% compared to HC-VEP without FAT, and by 68.17% compared to traditional SSVEP. Real-world validation experiments with physical robotic arms achieved an overall success rate of 90.0% in unshielded laboratory environments, demonstrating the system's robustness under realistic operating conditions. These findings validate the system's capability for flexible and efficient grasping of arbitrary objects in complex 3D environments, marking an important step toward practical BCRA applications.
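The abstract reports ITR in bits/min, the standard throughput metric for BCI spellers and decoders. As a reference point only (the paper's exact target count, accuracy, and trial length are not given here), a minimal sketch of the widely used Wolpaw ITR formula, with illustrative parameter values:

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, trial_s: float) -> float:
    """Wolpaw ITR: bits conveyed per selection, scaled to selections per minute.

    n_targets -- number of selectable targets (N)
    accuracy  -- classification accuracy (P), in [0, 1]
    trial_s   -- seconds per selection, including any gaze-shift/rest time
    """
    n, p = n_targets, accuracy
    # Below chance level the formula is not meaningful; report zero.
    if p <= 1.0 / n:
        return 0.0
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / trial_s)

# Illustrative values (NOT from the paper): 40 targets, 90% accuracy, 2 s/selection
print(round(itr_bits_per_min(40, 0.90, 2.0), 2))  # → 129.73
```

Note that ITR depends jointly on target count, accuracy, and selection time, which is why the paper pairs the decoding-accuracy gains of TFDE-Net with the time savings from FAT: either lever alone raises throughput, and they compound.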

Original language: English
Journal: IEEE Transactions on Mobile Computing
DOIs
Publication status: Accepted/In press - 2026

Keywords

  • brain-computer interface
  • Internet of Things
  • brain-controlled robotic arm
  • human-centered visual evoked potential
  • time-frequency domain enhanced network
