A review on EMG-based motor intention prediction of continuous human upper limb motion for human-robot collaboration

Luzheng Bi*, A. Feleke, Cuntai Guan

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review

302 Citations (Scopus)

Abstract

The electromyography (EMG) signal is one of the most widely used biological signals for predicting human motor intention, which is an essential element in human-robot collaboration systems. Studies on motor intention prediction from EMG signals have concentrated on classification and regression models, and there are numerous review and survey papers on classification models. However, to the best of our knowledge, there is no review paper on regression models or continuous motion prediction from EMG signals. Therefore, in this paper, we provide a comprehensive review of EMG-based motor intention prediction of continuous human upper limb motion. This review covers the models and approaches used in continuous motion estimation, the kinematic motion parameters estimated from EMG signals, and the performance metrics used for system validation. From the review, we offer some insights into future research directions on these subjects. We first review the overall structure and components of EMG-based human-robot collaboration systems. We then discuss the state of the art in continuous motion prediction of the human upper limb. Finally, we conclude the paper with a discussion of the current challenges and future research directions.

Original language: English
Pages (from-to): 113-127
Number of pages: 15
Journal: Biomedical Signal Processing and Control
Volume: 51
DOIs
Publication status: Published - May 2019

Keywords

  • Continuous motion
  • Electromyography (EMG)
  • Human-robot collaboration
  • Intention prediction
  • Upper limb

