TY - JOUR
T1 - A Double-Deep Q-Network-Based Energy Management Strategy for Hybrid Electric Vehicles under Variable Driving Cycles
AU - Zhang, Jiaqi
AU - Jiao, Xiaohong
AU - Yang, Chao
N1 - Publisher Copyright:
© 2021 Wiley-VCH GmbH
PY - 2021/2
Y1 - 2021/2
AB - As a core component of hybrid electric vehicles (HEVs), the energy management strategy (EMS) directly affects fuel-saving performance by regulating the energy flow between the engine and the battery. Most existing studies on EMS focus on buses or commuter cars, whose driving cycles are relatively fixed; however, there is also strong demand for an EMS that adapts to variable driving cycles. The rise of machine learning, especially deep learning and reinforcement learning, provides a new opportunity for designing EMSs for HEVs. Motivated by this need, a double-deep Q-network (DDQN)-based EMS for HEVs under variable driving cycles is proposed herein. The distance traveled within the driving cycle is introduced as an additional state of the DDQN-based EMS. The “curse of dimensionality” caused by the large number of states used during training is mitigated by the generalization capability of the deep neural network. To address the overestimation problem in training, two separate neural networks are designed for action selection and target value calculation, respectively. The effectiveness of the proposed DDQN-based EMS and its adaptability to variable driving cycles are verified through simulation comparisons with a Q-learning-based EMS and a rule-based EMS in terms of fuel economy.
KW - adaptability
KW - double-deep Q-networks
KW - energy management strategies
KW - hybrid electric vehicles
KW - variable driving cycles
UR - http://www.scopus.com/inward/record.url?scp=85099033700&partnerID=8YFLogxK
U2 - 10.1002/ente.202000770
DO - 10.1002/ente.202000770
M3 - Article
AN - SCOPUS:85099033700
SN - 2194-4288
VL - 9
JO - Energy Technology
JF - Energy Technology
IS - 2
M1 - 2000770
ER -