Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework

Guodong Du, Yuan Zou*, Xudong Zhang*, Lingxiong Guo, Ningyuan Guo

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

67 Citations (Scopus)

Abstract

A novel deep reinforcement learning (DRL) control framework for the energy management strategy of a series hybrid electric tracked vehicle (SHETV) is proposed in this paper. Firstly, the powertrain model of the vehicle is established and the energy management problem is formulated. Then, an efficient deep reinforcement learning framework based on the double deep Q-learning (DDQL) algorithm is built to solve the optimal control problem; it also incorporates a modified prioritized experience replay (MPER) and an adaptive optimization method for the network weights called AMSGrad. The proposed framework is verified on a realistic driving cycle and then compared with the dynamic programming (DP) method and a previous deep reinforcement learning method. Simulation results show that the newly constructed deep reinforcement learning framework achieves higher training efficiency and lower energy consumption than the previous deep reinforcement learning method, and the resulting fuel economy is shown to approach the global optimum. Moreover, its adaptability and robustness are validated on different driving schedules.
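For readers unfamiliar with the components named in the abstract, the sketch below illustrates a generic double deep Q-learning update combined with proportional prioritized experience replay and the AMSGrad optimizer. It is a minimal, hedged example only: the network sizes, hyper-parameters, state/action dimensions, and the replay details are assumptions for illustration, and it does not reproduce the authors' modified prioritized experience replay or their SHETV powertrain model.

```python
# Illustrative sketch: double DQN update with proportional prioritized replay
# and AMSGrad. All dimensions and hyper-parameters below are assumed values.
import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, n_states, n_actions, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_states, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions))

    def forward(self, x):
        return self.net(x)

n_states, n_actions, gamma = 4, 10, 0.99            # assumed problem size
online, target = QNet(n_states, n_actions), QNet(n_states, n_actions)
target.load_state_dict(online.state_dict())
# AMSGrad variant of Adam, as referenced in the abstract
optimizer = torch.optim.Adam(online.parameters(), lr=1e-3, amsgrad=True)

buffer, priorities = [], []                          # simple proportional PER
alpha, beta, eps = 0.6, 0.4, 1e-6

def push(state, action, reward, next_state):
    # states are stored as torch tensors; new samples get the current max priority
    buffer.append((state, action, reward, next_state))
    priorities.append(max(priorities, default=1.0))

def update(batch_size=32):
    # sample transitions with probability proportional to priority^alpha
    probs = torch.tensor(priorities) ** alpha
    probs /= probs.sum()
    idx = torch.multinomial(probs, batch_size, replacement=True)
    s, a, r, s2 = zip(*[buffer[i] for i in idx])
    s, s2 = torch.stack(s), torch.stack(s2)
    a = torch.tensor(a).unsqueeze(1)
    r = torch.tensor(r, dtype=torch.float32)

    # double DQN target: the online net selects the action, the target net evaluates it
    with torch.no_grad():
        a_star = online(s2).argmax(dim=1, keepdim=True)
        y = r + gamma * target(s2).gather(1, a_star).squeeze(1)
    q = online(s).gather(1, a).squeeze(1)

    # importance-sampling weights correct the bias introduced by prioritization
    w = (len(buffer) * probs[idx]) ** (-beta)
    w /= w.max()
    td_error = y - q
    loss = (w * td_error.pow(2)).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # refresh the priorities of the sampled transitions with their new TD errors
    for i, e in zip(idx.tolist(), td_error.abs().detach()):
        priorities[i] = float(e) + eps
```

In this generic form, the double estimator reduces the overestimation bias of standard deep Q-learning, while prioritized replay replays transitions with large temporal-difference errors more often; the paper's contribution lies in how these pieces are modified and tuned for the SHETV energy management problem, which is not reproduced here.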

Original language: English
Article number: 122523
Journal: Energy
Volume: 241
DOIs
Publication status: Published - 15 Feb 2022

Keywords

  • Adaptive optimization method
  • Double deep Q-learning algorithm
  • Energy management control
  • Modified prioritized experience replay
  • Series hybrid electric vehicle
