Distributed Real-Time Scheduling in Cloud Manufacturing by Deep Reinforcement Learning

Lixiang Zhang, Chen Yang, Yan Yan, Yaoguang Hu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

36 Citations (Scopus)

Abstract

With the extensive application of automated guided vehicles, real-time production scheduling that accounts for logistics services in cloud manufacturing (CM) has become an urgent problem. This study therefore focuses on the distributed real-time scheduling (DRTS) of multiple services to respond to dynamic and customized orders. First, a DRTS framework with cloud-edge collaboration is proposed to improve performance and satisfy responsiveness requirements, in which distributed actors and one centralized learner are deployed in the edge and cloud layers, respectively. Next, the DRTS problem is modeled as a semi-Markov decision process in which processing-service sequencing and logistics-service assignment are considered simultaneously. Then, a distributed dueling deep Q network (D3QN) with cloud-edge collaboration is developed to minimize the weighted tardiness of jobs. The experimental results show that the proposed D3QN achieves lower weighted tardiness and shorter flow time than other state-of-the-art algorithms, indicating that the proposed DRTS method has significant potential to provide efficient real-time decision-making in CM.
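The dueling Q-network the abstract refers to splits value estimation into a state-value stream V(s) and an action-advantage stream A(s, a), then recombines them. Below is a minimal sketch of that standard aggregation step only (not the authors' D3QN implementation; the function name and the scalar example values are illustrative):

```python
def dueling_q_values(state_value, advantages):
    """Combine a state value V(s) with per-action advantages A(s, a)
    using the standard dueling aggregation:

        Q(s, a) = V(s) + A(s, a) - mean_a' A(s, a')

    Subtracting the mean advantage makes the V/A decomposition
    identifiable, since a constant shift between the two streams
    would otherwise leave Q unchanged.
    """
    mean_adv = sum(advantages) / len(advantages)
    return [state_value + adv - mean_adv for adv in advantages]


# Example: one scheduling state with three candidate actions
# (e.g., three jobs competing for the next processing slot).
q = dueling_q_values(2.0, [1.0, 0.0, -1.0])
# → [3.0, 2.0, 1.0]
```

In a scheduler built this way, the action with the highest Q-value (here, the first) would be dispatched; the network weights producing V and A are what the centralized learner trains from the distributed actors' experience.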

Original language: English
Pages (from-to): 8999-9007
Number of pages: 9
Journal: IEEE Transactions on Industrial Informatics
Volume: 18
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2022

Keywords

  • Cloud-edge collaboration
  • cloud manufacturing
  • deep reinforcement learning
  • distributed
  • real-time scheduling
