Extracting and Transferring Hierarchical Knowledge to Robots Using Virtual Reality

Zhenliang Zhang, Jie Guo, Dongdong Weng, Yue Liu, Yongtian Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

2 Citations (Scopus)

Abstract

We study the knowledge transfer problem by training the task of folding clothes in a virtual world using an Oculus headset and validating it on a physical Baxter robot. We argue that such a complex transfer is realizable if an abstract, graph-based knowledge representation is adopted to facilitate the process. An And-Or-Graph (AOG) grammar model is introduced to represent the knowledge, which can be learned from human demonstrations performed in virtual reality (VR); a case analysis of folding clothes, represented and learned by the AOG grammar model, follows. In the experiment, the knowledge learned from the six given virtual scenarios is implemented on a physical robot platform, demonstrating that grammar-based knowledge is an effective representation.
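An And-Or graph decomposes a task into "and" nodes (sub-steps that must all be executed, in order) and "or" nodes (alternative strategies, of which one is chosen). As a rough illustrative sketch only (the node names and the folding decomposition below are assumptions, not the authors' actual learned grammar), such a structure might look like:

```python
# Hedged sketch of an And-Or-Graph (AOG) for a cloth-folding task.
# Node names and the task decomposition are illustrative assumptions,
# not the grammar learned in the paper.

import random

class AOGNode:
    """A node in an And-Or graph.

    - 'and' nodes expand to ALL children, in order (a sub-task sequence).
    - 'or' nodes expand to ONE chosen child (alternative strategies).
    - 'terminal' nodes are primitive robot actions.
    """
    def __init__(self, name, kind="terminal", children=None):
        self.name = name
        self.kind = kind              # "and", "or", or "terminal"
        self.children = children or []

def sample_parse(node, rng=random):
    """Sample one action sequence (a parse tree's leaves) from the grammar."""
    if node.kind == "terminal":
        return [node.name]
    if node.kind == "and":
        seq = []
        for child in node.children:  # execute every sub-task, in order
            seq.extend(sample_parse(child, rng))
        return seq
    # "or": pick one alternative (uniform here; weights would be learned
    # from the VR demonstrations in practice)
    return sample_parse(rng.choice(node.children), rng)

# Illustrative grammar: fold a shirt either sleeves-first or bottom-first.
grasp = AOGNode("grasp_edge")
fold_left = AOGNode("fold_left_sleeve")
fold_right = AOGNode("fold_right_sleeve")
fold_bottom = AOGNode("fold_bottom_up")
release = AOGNode("release")

sleeves_first = AOGNode("sleeves_first", "and",
                        [grasp, fold_left, fold_right, fold_bottom, release])
bottom_first = AOGNode("bottom_first", "and",
                       [grasp, fold_bottom, fold_left, fold_right, release])
fold_shirt = AOGNode("fold_shirt", "or", [sleeves_first, bottom_first])

print(sample_parse(fold_shirt))
```

Every sampled parse is one concrete action sequence a robot could execute, which is what makes the graph-based representation transferable from VR demonstrations to the physical platform.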

Original language: English
Title of host publication: Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 669-670
Number of pages: 2
ISBN (electronic): 9781728165325
DOI
Publication status: Published - March 2020
Event: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020 - Atlanta, United States
Duration: 22 March 2020 - 26 March 2020

Publication series

Name: Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020

Conference

Conference: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces, VRW 2020
Country/Territory: United States
City: Atlanta
Period: 22/03/20 - 26/03/20
