Autonomous Human-Robot Collaborative Assembly Method Driven by the Fusion of Large Language Model and Digital Twin

Jianpeng Chen, Haiwei Luo, Sihan Huang*, Meidi Zhang, Guoxin Wang, Yan Yan, Shikai Jing

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Human-robot collaboration (HRC) plays an important role in human-centric manufacturing, which requires collaborative robots to be able to work with humans autonomously. Understanding human intention during the assembly process is highly complex; therefore, we propose a method of autonomous HRC assembly driven by the fusion of a large language model (LLM) and a digital twin in this paper. The assembly state is recognized from two perspectives: the perception of key parts based on transfer learning and YOLO, and the recognition of operator actions based on LSTM and an attention mechanism. To improve the autonomy of HRC, a collaborative task decision-making method driven by an LLM fine-tuned on assembly domain knowledge is proposed. A case study of reducer assembly is presented to verify the effectiveness of the proposed method.
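The abstract's action-recognition component combines an LSTM with an attention mechanism. As a minimal illustration of the attention step only, the sketch below pools a sequence of per-frame hidden states (as an LSTM would produce) into a single context vector for action classification. All shapes, names, and the fixed scoring vector are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Attention pooling over per-frame hidden states.

    hidden_states: (T, d) array, e.g. LSTM outputs for T video frames.
    w: (d,) scoring vector (learned in practice; fixed here for illustration).
    Returns the attention-weighted context vector (d,) and the weights (T,).
    """
    scores = hidden_states @ w          # one relevance score per frame
    alpha = softmax(scores)             # normalize scores to attention weights
    context = alpha @ hidden_states     # weighted sum of frame features
    return context, alpha

# Toy example: 4 frames, 3-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
w = np.ones(3)
context, alpha = attention_pool(H, w)
```

In a full model, `context` would be fed to a classifier head over action labels; the attention weights indicate which frames most influenced the prediction.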

Original language: English
Article number: 012004
Journal: Journal of Physics: Conference Series
Volume: 2832
Issue number: 1
DOI: 10.1088/1742-6596/2832/1/012004
Publication status: Published - 2024
Event: 2024 International Conference on Intelligent Systems and Robotics, CISR 2024 - Dalian, China
Duration: 23 May 2024 - 26 May 2024

Cite this

Chen, J., Luo, H., Huang, S., Zhang, M., Wang, G., Yan, Y., & Jing, S. (2024). Autonomous Human-Robot Collaborative Assembly Method Driven by the Fusion of Large Language Model and Digital Twin. Journal of Physics: Conference Series, 2832(1), Article 012004. https://doi.org/10.1088/1742-6596/2832/1/012004