Deep reinforcement learning-based energy management strategy for fuel cell buses integrating future road information and cabin comfort control

Chunchun Jia, Wei Liu, Hongwen He, K. T. Chau*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

25 Citations (Scopus)

Abstract

Conventional energy management strategies (EMSs) for fuel cell vehicles (FCVs) aim to optimize powertrain energy consumption while ignoring air conditioning regulation, so the overall energy efficiency cannot be optimal. To enhance cabin-powertrain holistic energy utilization without compromising energy storage system durability or passenger thermal comfort, this paper proposes a novel energy management paradigm. Coordinated control of cabin comfort and fuel cell/battery durability is achieved by jointly exploiting onboard sensors and vehicle-cloud infrastructure. Specifically, the vehicle energy- and thermal-coupled control problem is formulated by considering energy consumption, component ageing, and the cabin's dynamic thermal model. In addition to the regular state space of energy management problems, future road information and ambient temperature are innovatively integrated into the energy management framework. A twin delayed deep deterministic policy gradient (TD3) algorithm is used to solve the problem and enhance overall energy efficiency. Simulation results indicate that, compared with rule-based EMSs, the proposed strategy maintains cabin comfort while extending battery life by at least 3.79 % and reducing the overall vehicle operating cost by at least 2.71 %.
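As a minimal sketch of the formulation the abstract describes, the snippet below assembles an augmented state vector (regular powertrain signals plus cabin/ambient temperatures and previewed road information) and a running-cost reward that penalizes fuel use, component ageing, and comfort deviation. All variable names, weights, and the comfort target are illustrative assumptions, not the paper's actual parameters; the real work trains a TD3 agent on such a state and reward.

```python
# Hypothetical sketch of the state and reward shaping described in the
# abstract. Weights (w_deg, w_comfort) and target_temp are assumed values
# for illustration only, not taken from the paper.

def build_state(speed, accel, soc, cabin_temp, ambient_temp, future_grades):
    """Augmented state: powertrain signals, thermal signals, and
    previewed road information (e.g. upcoming road grades from the cloud)."""
    return [speed, accel, soc, cabin_temp, ambient_temp, *future_grades]

def reward(h2_cost, fc_degradation, batt_degradation, cabin_temp,
           target_temp=24.0, w_deg=1.0, w_comfort=0.5):
    """Negative running cost: hydrogen cost + weighted fuel cell/battery
    ageing cost + weighted cabin-comfort deviation."""
    comfort_penalty = abs(cabin_temp - target_temp)
    return -(h2_cost
             + w_deg * (fc_degradation + batt_degradation)
             + w_comfort * comfort_penalty)

# Example step: 12 m/s, mild acceleration, 65 % SOC, warm cabin, hot ambient,
# three previewed grade samples.
s = build_state(12.0, 0.3, 0.65, 26.5, 33.0, [0.01, 0.02, 0.0])
r = reward(h2_cost=0.08, fc_degradation=0.01, batt_degradation=0.02,
           cabin_temp=26.5)
```

In a TD3-based EMS the agent would map this state to continuous actions such as fuel cell output power and air-conditioning power, with the reward trading off operating cost, durability, and comfort.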

Original language: English
Article number: 119032
Journal: Energy Conversion and Management
Volume: 321
DOIs
Publication status: Published - 1 Dec 2024

Keywords

  • Cabin comfort control
  • Deep reinforcement learning
  • Energy management strategy
  • Fuel cell bus
  • Multi-source information fusion
