TY - JOUR
T1 - NOMA-based energy-efficient task scheduling in vehicular edge computing networks
T2 - A self-imitation learning-based approach
AU - Dong, Peiran
AU - Ning, Zhaolong
AU - Ma, Rong
AU - Wang, Xiaojie
AU - Hu, Xiping
AU - Hu, Bin
N1 - Publisher Copyright:
© 2013 China Institute of Communications.
PY - 2020/11
Y1 - 2020/11
N2 - Mobile Edge Computing (MEC) is a promising paradigm for alleviating the computation and storage burdens of terminals in wireless networks. The huge energy consumption of MEC servers challenges the establishment of smart cities and limits the service time of servers powered by rechargeable batteries. In addition, the Orthogonal Multiple Access (OMA) technique cannot utilize limited spectrum resources fully and efficiently. Therefore, Non-Orthogonal Multiple Access (NOMA)-based energy-efficient task scheduling among MEC servers for delay-constrained mobile applications is important, especially in highly dynamic vehicular edge computing networks. The various movement patterns of vehicles lead to unbalanced offloading requirements and different load pressures on MEC servers. Self-Imitation Learning (SIL)-based Deep Reinforcement Learning (DRL) has emerged as a promising machine learning technique for overcoming obstacles in various research fields, especially in time-varying networks. In this paper, we first introduce MEC technologies related to vehicular networks. Then, we propose a DRL-based energy-efficient approach for task scheduling in vehicular edge computing networks, with the aim of both guaranteeing the task latency requirements of multiple users and minimizing the total energy consumption of MEC servers. Numerical results demonstrate that the proposed algorithm outperforms other methods.
AB - Mobile Edge Computing (MEC) is a promising paradigm for alleviating the computation and storage burdens of terminals in wireless networks. The huge energy consumption of MEC servers challenges the establishment of smart cities and limits the service time of servers powered by rechargeable batteries. In addition, the Orthogonal Multiple Access (OMA) technique cannot utilize limited spectrum resources fully and efficiently. Therefore, Non-Orthogonal Multiple Access (NOMA)-based energy-efficient task scheduling among MEC servers for delay-constrained mobile applications is important, especially in highly dynamic vehicular edge computing networks. The various movement patterns of vehicles lead to unbalanced offloading requirements and different load pressures on MEC servers. Self-Imitation Learning (SIL)-based Deep Reinforcement Learning (DRL) has emerged as a promising machine learning technique for overcoming obstacles in various research fields, especially in time-varying networks. In this paper, we first introduce MEC technologies related to vehicular networks. Then, we propose a DRL-based energy-efficient approach for task scheduling in vehicular edge computing networks, with the aim of both guaranteeing the task latency requirements of multiple users and minimizing the total energy consumption of MEC servers. Numerical results demonstrate that the proposed algorithm outperforms other methods.
KW - NOMA
KW - energy-efficient scheduling
KW - imitation learning
KW - vehicular edge computing
UR - http://www.scopus.com/inward/record.url?scp=85097229055&partnerID=8YFLogxK
U2 - 10.23919/JCC.2020.11.001
DO - 10.23919/JCC.2020.11.001
M3 - Article
AN - SCOPUS:85097229055
SN - 1673-5447
VL - 17
SP - 1
EP - 11
JO - China Communications
JF - China Communications
IS - 11
M1 - 9267792
ER -