TY - JOUR
T1 - Dependency-Aware Task Offloading Strategy via Heterogeneous Graph Neural Network and Deep Reinforcement Learning
AU - Wu, Jinming
AU - Zou, Yuan
AU - Zhang, Xudong
AU - Liu, Jiahui
AU - Sun, Wujing
AU - Du, Guodong
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2025
Y1 - 2025
N2 - As the Internet of Things proliferates, cloud-assisted mobile edge computing (MEC) enables intelligent connected vehicles (ICVs) to offload their computationally intensive tasks to servers within the Internet of Vehicles, thereby reducing delay and energy consumption. However, most existing research on edge computing offloading overlooks the dependency relationships between subtasks. These dependencies significantly increase the complexity of task offloading, making it a challenging endeavor to devise general solutions for scenarios of varying scales. To tackle this challenge, we present a heterogeneous graph attention network (HGAT)-augmented deep reinforcement learning framework for dependency-aware task offloading, aiming to minimize task completion time and energy consumption. The dynamic system of vehicles and servers is modeled as an undirected graph, with nodes corresponding to servers/vehicles and edges capturing the intensity of task competition. Tasks are modeled as directed acyclic graphs, where nodes denote subtasks and directed edges define their dependencies. An HGAT-based encoder is then introduced to effectively capture the intricate relationships between subtasks and server cores. Subtask selection and server core assignment are formulated as a Markov Decision Process and solved using the Proximal Policy Optimization method. Simulation results demonstrate that the proposed algorithm outperforms existing ones across various scenarios, showcasing superior adaptability and performance benefits.
AB - As the Internet of Things proliferates, cloud-assisted mobile edge computing (MEC) enables intelligent connected vehicles (ICVs) to offload their computationally intensive tasks to servers within the Internet of Vehicles, thereby reducing delay and energy consumption. However, most existing research on edge computing offloading overlooks the dependency relationships between subtasks. These dependencies significantly increase the complexity of task offloading, making it a challenging endeavor to devise general solutions for scenarios of varying scales. To tackle this challenge, we present a heterogeneous graph attention network (HGAT)-augmented deep reinforcement learning framework for dependency-aware task offloading, aiming to minimize task completion time and energy consumption. The dynamic system of vehicles and servers is modeled as an undirected graph, with nodes corresponding to servers/vehicles and edges capturing the intensity of task competition. Tasks are modeled as directed acyclic graphs, where nodes denote subtasks and directed edges define their dependencies. An HGAT-based encoder is then introduced to effectively capture the intricate relationships between subtasks and server cores. Subtask selection and server core assignment are formulated as a Markov Decision Process and solved using the Proximal Policy Optimization method. Simulation results demonstrate that the proposed algorithm outperforms existing ones across various scenarios, showcasing superior adaptability and performance benefits.
KW - Deep reinforcement learning
KW - dependency-aware task offloading
KW - directed acyclic graph
KW - heterogeneous graph attention networks
KW - mobile edge computing
KW - undirected graph
UR - http://www.scopus.com/inward/record.url?scp=86000716056&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2025.3549441
DO - 10.1109/JIOT.2025.3549441
M3 - Article
AN - SCOPUS:86000716056
SN - 2327-4662
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
ER -