Abstract
Deep reinforcement learning (DRL)-based energy management strategies (EMSs) often suffer from reduced reliability and a lack of interpretability under unfamiliar driving conditions, which degrades optimization performance. To address this issue, an energy management method based on policy reliability assessment is proposed, with fuel cell vehicles as the research object. An offline training framework that integrates intelligent learning with model-based optimization is developed, in which the soft actor-critic (SAC) algorithm dynamically determines the optimal equivalent factor for the equivalent consumption minimization strategy (ECMS). A policy reliability assessment mechanism is designed that uses an ensemble policy network model to quantitatively evaluate the reliability of EMS decisions. In addition, based on predefined reliability thresholds, the equivalent factor is corrected online to ensure the interpretability of safety-critical actions. Simulation results show that, compared with standard SAC and a conventional adaptive ECMS, the proposed SAC-ECMS training framework improves fuel economy by 4.32% and 7.82%, respectively. Under unfamiliar test cycles, the reliability assessment mechanism further improves fuel economy by 2.87% while meeting real-time computational requirements.
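The reliability-gated correction of the equivalent factor can be sketched as follows. This is a minimal illustration only: the function name `ensemble_equivalent_factor`, the use of ensemble standard deviation as the disagreement metric, and the threshold value are all assumptions, not the paper's actual implementation.

```python
from statistics import mean, pstdev

def ensemble_equivalent_factor(policies, state, ef_nominal, threshold=0.1):
    """Gate the DRL-proposed equivalent factor by ensemble agreement.

    Each policy in `policies` maps a driving state to an equivalent
    factor. If the ensemble's spread exceeds `threshold`, the learned
    decision is deemed unreliable and the nominal (model-based)
    equivalent factor is used instead.
    """
    proposals = [p(state) for p in policies]
    spread = pstdev(proposals)      # disagreement across the ensemble
    if spread <= threshold:         # reliable: accept the learned factor
        return mean(proposals), True
    return ef_nominal, False        # unreliable: fall back online
```

In this sketch, agreement among the ensemble members is treated as a proxy for decision reliability under the current driving condition, so the fallback keeps safety-critical actions interpretable when the learned policy is queried outside its training distribution.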
| Translated title of the contribution | Research on DRL-ECMS Energy Management Method for Fuel Cell Vehicle Based on Policy Reliability Assessment |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 127-136 |
| Number of pages | 10 |
| Journal | Qiche Gongcheng/Automotive Engineering |
| Volume | 48 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 25 Jan 2026 |