TY - JOUR
T1 - A comparative study of deep reinforcement learning based energy management strategy for hybrid electric vehicle
AU - Wang, Zexing
AU - He, Hongwen
AU - Peng, Jiankun
AU - Chen, Weiqi
AU - Wu, Changcheng
AU - Fan, Yi
AU - Zhou, Jiaxuan
N1 - Publisher Copyright:
© 2023
PY - 2023/10/1
Y1 - 2023/10/1
N2 - Energy management strategies (EMSs) are essential for hybrid electric vehicles (HEVs), as they can further exploit the potential of HEVs to save energy and reduce emissions. Research on deep reinforcement learning (DRL)-based EMSs is developing rapidly. However, most studies have ignored the impact of uniform test benchmarks on the performance of DRL-based EMSs and focus too heavily on fuel economy improvement, resulting in a single optimization objective. In this study, four DRL-based EMSs are designed for HEVs with a multi-objective optimization reward function that additionally considers battery health. The optimal learning rates and weight coefficients of the four EMSs are determined first. On this basis, the monetary cost, fuel cost, and battery health of each EMS are studied intensively under nine driving cycles. It is initially concluded that the EMSs perform better in high-speed conditions and worse in suburban conditions. A comparative analysis under unlearned mixed driving cycles validates this conclusion and shows that the SAC-based EMS achieves a fuel consumption of 4.218 L per 100 km and retains 99.96 % battery health, the lowest fuel consumption and the highest remaining battery health of the four EMSs. This paper provides a theoretical basis for the parametric and driving-cycle study of DRL-based EMSs.
AB - Energy management strategies (EMSs) are essential for hybrid electric vehicles (HEVs), as they can further exploit the potential of HEVs to save energy and reduce emissions. Research on deep reinforcement learning (DRL)-based EMSs is developing rapidly. However, most studies have ignored the impact of uniform test benchmarks on the performance of DRL-based EMSs and focus too heavily on fuel economy improvement, resulting in a single optimization objective. In this study, four DRL-based EMSs are designed for HEVs with a multi-objective optimization reward function that additionally considers battery health. The optimal learning rates and weight coefficients of the four EMSs are determined first. On this basis, the monetary cost, fuel cost, and battery health of each EMS are studied intensively under nine driving cycles. It is initially concluded that the EMSs perform better in high-speed conditions and worse in suburban conditions. A comparative analysis under unlearned mixed driving cycles validates this conclusion and shows that the SAC-based EMS achieves a fuel consumption of 4.218 L per 100 km and retains 99.96 % battery health, the lowest fuel consumption and the highest remaining battery health of the four EMSs. This paper provides a theoretical basis for the parametric and driving-cycle study of DRL-based EMSs.
KW - Deep reinforcement learning
KW - Energy management
KW - Hybrid electric vehicle
KW - Power battery health
UR - http://www.scopus.com/inward/record.url?scp=85165990875&partnerID=8YFLogxK
U2 - 10.1016/j.enconman.2023.117442
DO - 10.1016/j.enconman.2023.117442
M3 - Article
AN - SCOPUS:85165990875
SN - 0196-8904
VL - 293
JO - Energy Conversion and Management
JF - Energy Conversion and Management
M1 - 117442
ER -