TY - JOUR
T1 - Uncertainty-aware Deep Reinforcement Learning for Trainable Equivalent Consumption Minimization Strategy of Fuel Cell Hybrid Electric Tracked Vehicle
AU - Su, Qicong
AU - Huang, Ruchen
AU - Zhang, Zhendong
AU - Shou, Yiwen
AU - He, Hongwen
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Energy management strategies (EMSs) are pivotal in optimizing energy efficiency for vehicles equipped with hybrid electric powertrains. Despite the growing adoption of deep reinforcement learning (DRL)-based approaches, challenges persist in achieving satisfactory optimization performance and maintaining reliable control. Motivated by these challenges, this paper introduces a novel trainable equivalent consumption minimization strategy (ECMS) framework with uncertainty-aware control for fuel cell hybrid electric tracked vehicles (FCHETVs). First, the proposed framework employs a DRL algorithm to dynamically determine and optimize the equivalent factor in the ECMS method, facilitating improved fuel economy. Then, the soft actor-critic (SAC) algorithm is formulated for efficient policy learning. To further enhance control reliability, an ensembled policy network method is incorporated to measure uncertainty and mitigate suboptimal actions, thereby improving decision-making robustness. Simulation results reveal that the SAC-based trainable ECMS achieves significant fuel economy improvements, outperforming the traditional SAC and adaptive ECMS methods by 2.93% and 5.15%, respectively. Moreover, the ensemble model ensures reliable and effective control, with online testing results indicating an additional 2.08% improvement in fuel economy. These findings underscore the effectiveness of integrating learning-based and optimization-based approaches in EMS design, offering a robust pathway to reducing energy consumption and promoting sustainable transportation solutions.
AB - Energy management strategies (EMSs) are pivotal in optimizing energy efficiency for vehicles equipped with hybrid electric powertrains. Despite the growing adoption of deep reinforcement learning (DRL)-based approaches, challenges persist in achieving satisfactory optimization performance and maintaining reliable control. Motivated by these challenges, this paper introduces a novel trainable equivalent consumption minimization strategy (ECMS) framework with uncertainty-aware control for fuel cell hybrid electric tracked vehicles (FCHETVs). First, the proposed framework employs a DRL algorithm to dynamically determine and optimize the equivalent factor in the ECMS method, facilitating improved fuel economy. Then, the soft actor-critic (SAC) algorithm is formulated for efficient policy learning. To further enhance control reliability, an ensembled policy network method is incorporated to measure uncertainty and mitigate suboptimal actions, thereby improving decision-making robustness. Simulation results reveal that the SAC-based trainable ECMS achieves significant fuel economy improvements, outperforming the traditional SAC and adaptive ECMS methods by 2.93% and 5.15%, respectively. Moreover, the ensemble model ensures reliable and effective control, with online testing results indicating an additional 2.08% improvement in fuel economy. These findings underscore the effectiveness of integrating learning-based and optimization-based approaches in EMS design, offering a robust pathway to reducing energy consumption and promoting sustainable transportation solutions.
KW - deep reinforcement learning
KW - energy management
KW - ensembled policy network
KW - equivalent consumption minimization strategy
KW - soft actor-critic
UR - http://www.scopus.com/inward/record.url?scp=105003584021&partnerID=8YFLogxK
U2 - 10.1109/TTE.2025.3563203
DO - 10.1109/TTE.2025.3563203
M3 - Article
AN - SCOPUS:105003584021
SN - 2332-7782
JO - IEEE Transactions on Transportation Electrification
JF - IEEE Transactions on Transportation Electrification
ER -