Abstract
The rapid advancement of cloud computing, Big Data, and their related applications has led to a skyrocketing increase in data center energy consumption year by year. Prior approaches for improving data center energy efficiency mostly struggle with the high system dynamics or the sheer complexity of data centers. In this paper, we propose an optimization framework based on deep reinforcement learning, named DeepEE, to jointly optimize energy consumption from the perspectives of task scheduling and cooling control. In DeepEE, a PArameterized action space based Deep Q-Network (PADQN) algorithm is proposed to tackle the hybrid action space problem. Then, a dynamic time factor mechanism for adjusting the cooling control interval is introduced into PADQN (PADQN-D) to achieve more accurate and efficient coordination of the IT and cooling subsystems. Finally, to train and evaluate the proposed algorithms safely and quickly, a simulation platform is built to model the dynamics of the IT and cooling subsystems. Extensive experiments based on real traces show that: 1) the proposed PADQN algorithm reduces energy consumption by up to 15% and 10% compared with the baseline siloed and joint optimization approaches, respectively; 2) the proposed PADQN-D algorithm with a dynamic cooling control interval better adapts to changes in IT workload; and 3) our proposed algorithms achieve a more stable performance gain in terms of power consumption by adopting the parameterized action space.
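To make the "hybrid action space" idea concrete, below is a minimal, illustrative sketch of a parameterized-action DQN in PyTorch: a discrete head picks an action (e.g., a scheduling decision) while an actor attaches continuous parameters (e.g., cooling setpoints) to each discrete choice. This is an assumption-laden toy, not the authors' PADQN implementation; the class names, network sizes, and dimensions are invented for illustration.

```python
# Illustrative parameterized-action DQN skeleton (NOT the paper's PADQN code).
# Assumed toy dimensions: 8 state features, 4 discrete actions, 2 continuous
# cooling parameters per discrete action.
import torch
import torch.nn as nn


class ParamActor(nn.Module):
    """Maps a state to one continuous parameter vector per discrete action."""
    def __init__(self, state_dim, n_discrete, param_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, n_discrete * param_dim), nn.Tanh(),  # params in [-1, 1]
        )
        self.n_discrete, self.param_dim = n_discrete, param_dim

    def forward(self, state):
        return self.net(state).view(-1, self.n_discrete, self.param_dim)


class QNetwork(nn.Module):
    """Scores each discrete action given the state and all continuous parameters."""
    def __init__(self, state_dim, n_discrete, param_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + n_discrete * param_dim, 128), nn.ReLU(),
            nn.Linear(128, n_discrete),
        )

    def forward(self, state, params):
        x = torch.cat([state, params.flatten(start_dim=1)], dim=1)
        return self.net(x)  # one Q-value per discrete action


def select_action(actor, q_net, state):
    """Greedy hybrid action: continuous parameters from the actor, then the
    discrete action with the highest Q-value, paired with its own parameters."""
    with torch.no_grad():
        params = actor(state)            # shape (1, n_discrete, param_dim)
        q_values = q_net(state, params)  # shape (1, n_discrete)
        k = q_values.argmax(dim=1).item()
        return k, params[0, k]


# Toy usage with the assumed dimensions above.
actor, q_net = ParamActor(8, 4, 2), QNetwork(8, 4, 2)
k, theta_k = select_action(actor, q_net, torch.randn(1, 8))
```

In this style of hybrid control, training would update the Q-network with a standard temporal-difference loss and the actor by ascending the Q-value of its own parameters; those details are omitted here.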
| Original language | English |
| --- | --- |
| Pages (from-to) | 1310-1323 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Services Computing |
| Volume | 16 |
| Issue number | 2 |
| DOI | |
| Publication status | Published - 1 Mar 2023 |