TY - JOUR
T1 - Reinforcement learning-based real-time intelligent energy management for hybrid electric vehicles in a model predictive control framework
AU - Yang, Ningkang
AU - Ruan, Shumin
AU - Han, Lijin
AU - Liu, Hui
AU - Guo, Lingxiong
AU - Xiang, Changle
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/5/1
Y1 - 2023/5/1
N2 - This paper proposes a real-time energy management strategy (EMS) for hybrid electric vehicles (HEVs) by incorporating reinforcement learning (RL) into a model predictive control (MPC) framework, which avoids the inherent drawbacks of RL (excessive learning time and lack of adaptability) and markedly enhances the real-time performance of MPC. First, the MPC framework for the energy management problem is formulated. Within this framework, a novel long short-term memory (LSTM) neural network is used to construct the velocity predictor for more accurate prediction, and its prediction capability is verified by a comparative analysis. Then, the HEV prediction model and the velocity predictor are treated as the RL model with which the RL agent interacts. On this basis, the optimal control sequence over the prediction horizon is learned through model-based RL, but only the first element is actually executed, and the RL process begins anew after the prediction horizon moves forward. In simulation, the algorithm's convergence is analyzed and the influence of the prediction horizon length is evaluated. The proposed EMS is then compared with dynamic programming (DP), conventional MPC, and a conventional RL method, and the results demonstrate its performance and adaptability. Finally, a hardware-in-the-loop test validates its practical applicability.
AB - This paper proposes a real-time energy management strategy (EMS) for hybrid electric vehicles (HEVs) by incorporating reinforcement learning (RL) into a model predictive control (MPC) framework, which avoids the inherent drawbacks of RL (excessive learning time and lack of adaptability) and markedly enhances the real-time performance of MPC. First, the MPC framework for the energy management problem is formulated. Within this framework, a novel long short-term memory (LSTM) neural network is used to construct the velocity predictor for more accurate prediction, and its prediction capability is verified by a comparative analysis. Then, the HEV prediction model and the velocity predictor are treated as the RL model with which the RL agent interacts. On this basis, the optimal control sequence over the prediction horizon is learned through model-based RL, but only the first element is actually executed, and the RL process begins anew after the prediction horizon moves forward. In simulation, the algorithm's convergence is analyzed and the influence of the prediction horizon length is evaluated. The proposed EMS is then compared with dynamic programming (DP), conventional MPC, and a conventional RL method, and the results demonstrate its performance and adaptability. Finally, a hardware-in-the-loop test validates its practical applicability.
KW - Hybrid electric vehicle
KW - Long short-term memory network
KW - Model predictive control
KW - Model-based reinforcement learning
KW - Real-time energy management
UR - http://www.scopus.com/inward/record.url?scp=85148596037&partnerID=8YFLogxK
U2 - 10.1016/j.energy.2023.126971
DO - 10.1016/j.energy.2023.126971
M3 - Article
AN - SCOPUS:85148596037
SN - 0360-5442
VL - 270
JO - Energy
JF - Energy
M1 - 126971
ER -