Human–robot skill transmission for mobile robot via learning by demonstration

Jiehao Li, Junzheng Wang*, Shoukun Wang, Chenguang Yang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

22 Citations (Scopus)

Abstract

This paper proposes a skill transmission technique for a mobile robot via learning by demonstration. When material is transported to a designated location, the robot exhibits human-like capabilities such as autonomous target tracking. To this end, a skill transmission framework is designed in which a Kinect sensor is used for human activity recognition to generate a planned path. Moreover, the dynamic movement primitive (DMP) method is implemented to represent the teaching data, and Gaussian mixture regression (GMR) is used to encode the learned trajectory. Furthermore, in order to realize accurate position control during trajectory tracking, a model predictive tracking controller is investigated, in which a recurrent neural network is used to eliminate the effect of uncertain interactions. Finally, experimental tasks using the mobile robot (BIT-6NAZA) are carried out to demonstrate the effectiveness of the developed techniques in real-world scenarios.
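As an illustration of the trajectory-encoding step described above, the sketch below shows a minimal one-dimensional discrete dynamic movement primitive in the standard Ijspeert-style form, fitted to a single demonstration with locally weighted regression. The class name, hyperparameter values, and NumPy implementation are illustrative assumptions, not code or parameters from the paper.

import numpy as np

class DMP1D:
    """Minimal 1-D discrete dynamic movement primitive (standard
    Ijspeert-style form). Hyperparameters are illustrative defaults."""

    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, beta_z, alpha_x
        # Gaussian basis centres spaced evenly in phase x (not in time).
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = n_basis ** 1.5 / self.c      # common width heuristic
        self.w = np.zeros(n_basis)

    def _basis(self, x):
        # x: (N,) phase values -> (N, n_basis) basis activations
        return np.exp(-self.h * (x[:, None] - self.c) ** 2)

    def fit(self, y, dt):
        """Learn forcing-term weights from one demonstrated trajectory y."""
        y = np.asarray(y, float)
        self.tau, self.y0, self.g = len(y) * dt, y[0], y[-1]
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(len(y)) * dt / self.tau)
        # Invert the transformation system:
        # f = tau^2 * ydd - alpha_z * (beta_z * (g - y) - tau * yd)
        f = self.tau ** 2 * ydd - self.alpha_z * (
            self.beta_z * (self.g - y) - self.tau * yd)
        s = x * (self.g - self.y0)            # phase/goal scaling of f
        psi = self._basis(x)
        # Locally weighted regression: one scalar weight per basis function.
        self.w = (psi * (s * f)[:, None]).sum(0) / (
            (psi * (s ** 2)[:, None]).sum(0) + 1e-10)

    def rollout(self, dt, g=None):
        """Reproduce the motion, optionally toward a new goal g."""
        g = self.g if g is None else g
        y, z, x = self.y0, 0.0, 1.0
        out = []
        for _ in range(int(self.tau / dt)):
            psi = self._basis(np.array([x]))[0]
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            z += dt / self.tau * (self.alpha_z * (self.beta_z * (g - y) - z) + f)
            y += dt / self.tau * z
            x += dt / self.tau * (-self.alpha_x * x)
            out.append(y)
        return np.array(out)

After fit(), rollout() reproduces the demonstration and generalizes to a shifted goal by construction of the goal-scaled forcing term. In the pipeline the abstract describes, GMR would first be applied across multiple demonstrations, and the resulting reference trajectory would then be fed to the model predictive tracking controller; neither step is shown here.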

Original language: English
Pages (from-to): 23441-23451
Number of pages: 11
Journal: Neural Computing and Applications
Volume: 35
Issue number: 32
Publication status: Published - Nov 2023

Keywords

  • Human–robot skill transfer
  • Imitation learning
  • Learning by demonstration
  • Mobile robot
