Human–robot skill transmission for mobile robot via learning by demonstration

Jiehao Li, Junzheng Wang*, Shoukun Wang, Chenguang Yang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)

Abstract

This paper proposes a skill transmission technique for a mobile robot via learning by demonstration. When material is transported to a designated location, the robot exhibits human-like capabilities such as autonomous target tracking. To this end, a skill transmission framework is designed in which a Kinect sensor is used for human activity recognition to generate a planned path. Moreover, the dynamic movement primitive method is employed to represent the teaching data, and Gaussian mixture regression is used to encode the learned trajectory. Furthermore, in order to realize accurate position control in trajectory tracking, a model predictive tracking controller is investigated, in which a recurrent neural network is used to eliminate the effect of uncertain interactions. Finally, experimental tasks on the mobile robot (BIT-6NAZA) are carried out to demonstrate the effectiveness of the developed techniques in real-world scenarios.
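To illustrate the trajectory-learning step described in the abstract, the following is a minimal sketch of a discrete dynamic movement primitive (DMP) for a single trajectory dimension, assuming the standard Ijspeert-style formulation rather than the paper's exact implementation. The class name, gain values (alpha_z, beta_z, alpha_x) and number of basis functions are illustrative assumptions, and the Gaussian mixture regression and model predictive control stages of the framework are not shown.

```python
# Illustrative discrete DMP sketch (assumed formulation, not the authors' code).
import numpy as np

class DiscreteDMP:
    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=3.0):
        self.n_basis = n_basis
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, beta_z, alpha_x
        # Basis centres spread along the phase variable x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.h = n_basis / (self.c ** 2)           # basis widths
        self.w = np.zeros(n_basis)                 # forcing-term weights

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Fit forcing-term weights to one demonstrated trajectory."""
        T = len(y_demo)
        self.y0, self.g, self.tau = y_demo[0], y_demo[-1], (T - 1) * dt
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)   # phase
        # Forcing term recovered from the transformation system dynamics
        f_target = (self.tau ** 2 * ydd
                    - self.alpha_z * (self.beta_z * (self.g - y_demo) - self.tau * yd))
        scale = x * (self.g - self.y0)
        for i in range(self.n_basis):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (np.sum(scale * psi * f_target)
                         / (np.sum(scale ** 2 * psi) + 1e-10))

    def rollout(self, dt, goal=None):
        """Reproduce the learned motion, optionally towards a new goal."""
        g = self.g if goal is None else goal
        y, v, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(int(self.tau / dt)):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            vdot = (self.alpha_z * (self.beta_z * (g - y) - v) + f) / self.tau
            v += vdot * dt
            y += v / self.tau * dt
            x += -self.alpha_x * x / self.tau * dt
            traj.append(y)
        return np.array(traj)

# Example: learn a smooth demonstration and replay it towards a new goal.
if __name__ == "__main__":
    dt = 0.01
    t = np.linspace(0, 1, 101)
    demo = 10 * t**3 - 15 * t**4 + 6 * t**5        # smooth 0 -> 1 motion
    dmp = DiscreteDMP()
    dmp.fit(demo, dt)
    print(dmp.rollout(dt)[-1], dmp.rollout(dt, goal=2.0)[-1])
```

The point of the sketch is the property the abstract relies on: once the forcing term is learned from the teaching data, the same primitive can regenerate the demonstrated motion shape towards a different target position.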

Original language: English
Pages (from-to): 23441-23451
Number of pages: 11
Journal: Neural Computing and Applications
Volume: 35
Issue number: 32
DOI
Publication status: Published - Nov 2023
