AI-Enabled Offloading Decision-Making and Resource Allocation Optimization Under Emergency Scenarios

Mengqian Cheng, Xiaoqin Song, Lei Lei*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

In this paper, we present an AI-enabled approach for multi-server computation offloading in aerial computing networks, focusing on the challenges of computation-intensive services in 6G emergency scenarios. Our method uses unmanned aerial vehicles (UAVs) as computing platforms to serve emergency vehicle users. We first construct the network architecture and define the optimization objective, which minimizes the total system delay subject to several constraints. To solve this problem, we introduce an improved dueling double deep Q-network algorithm that incorporates dueling networks and prioritized experience replay. Numerical simulation results demonstrate the effectiveness of our approach: compared to the traditional DDQN algorithm, the proposed algorithm reduces the total system delay by about 25%.
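The abstract names two standard deep-RL components: a dueling network head, which splits the Q-value into a state value and per-action advantages, and the double-DQN target, where the online network selects the next action and the target network evaluates it. The following minimal NumPy sketch illustrates those two textbook computations only; it is not the paper's implementation, and the toy action set (local vs. two hypothetical UAV servers) is an assumption for illustration.

```python
import numpy as np

def dueling_q(value, advantages):
    """Dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a).

    Subtracting the mean advantage keeps V and A identifiable.
    """
    return value + advantages - advantages.mean()

def double_dqn_target(reward, gamma, q_online_next, q_target_next, done):
    """Double-DQN target: the online net picks the next action,
    the target net supplies its value, reducing overestimation bias."""
    a_star = int(np.argmax(q_online_next))          # action selection (online net)
    bootstrap = 0.0 if done else gamma * q_target_next[a_star]  # evaluation (target net)
    return reward + bootstrap

# Toy numbers: one state, three candidate offloading actions
# (e.g. compute locally / offload to UAV-1 / offload to UAV-2 -- hypothetical).
q_values = dueling_q(value=1.0, advantages=np.array([0.5, -0.5, 0.0]))

# Delay-minimization rewards are typically negative delays, hence reward < 0 here.
target = double_dqn_target(reward=-2.0, gamma=0.9,
                           q_online_next=np.array([1.0, 3.0, 2.0]),
                           q_target_next=np.array([0.8, 2.5, 2.2]),
                           done=False)
```

With these toy inputs the dueling head yields Q-values of [1.5, 0.5, 1.0], and the double-DQN target evaluates the online net's chosen action (index 1) with the target net's estimate, giving -2.0 + 0.9 × 2.5 = 0.25.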

Original language: English
Title of host publication: 2023 IEEE Globecom Workshops, GC Wkshps 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1734-1739
Number of pages: 6
ISBN (Electronic): 9798350370218
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 2023 IEEE Globecom Workshops, GC Wkshps 2023 - Kuala Lumpur, Malaysia
Duration: 4 Dec 2023 - 8 Dec 2023

Publication series

Name: 2023 IEEE Globecom Workshops, GC Wkshps 2023

Conference

Conference: 2023 IEEE Globecom Workshops, GC Wkshps 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 4/12/23 - 8/12/23

Keywords

  • AI model
  • aerial computing networks
  • dueling double deep Q-network
  • multi-access edge computing
