Reinforcement Learning of Adaptive Energy Management with Transition Probability for a Hybrid Electric Tracked Vehicle

Teng Liu, Yuan Zou, Dexing Liu, Fengchun Sun

Research output: Contribution to journal › Article › peer-review

220 Citations (Scopus)

Abstract

A reinforcement learning-based adaptive energy management (RLAEM) strategy is proposed for a hybrid electric tracked vehicle (HETV) in this paper. A control-oriented model of the HETV is first established, in which the battery state of charge (SOC) and the generator speed are the state variables, and the engine torque is the control variable. Subsequently, a transition probability matrix is learned from a specific driving schedule of the HETV. The proposed RLAEM decides the appropriate power split between the battery and the engine-generator set (EGS) to minimize fuel consumption over different driving schedules. With the RLAEM, the driver's power requirement is guaranteed and the fuel economy is improved. Finally, the RLAEM is compared with stochastic dynamic programming (SDP)-based energy management for different driving schedules. The simulation results demonstrate the adaptability, optimality, and learning ability of the RLAEM, as well as its capacity to reduce computation time.
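The abstract describes the method only at a high level. The sketch below is a minimal, illustrative rendering of the general idea: a transition probability matrix counted from one driving schedule combined with Q-learning over a discretized state. The grid sizes, the use of power demand as the stochastic exogenous state, the fuel-rate function, and the battery update are all hypothetical placeholders chosen for illustration; they are not the paper's vehicle model or the authors' implementation.

```python
import numpy as np

# Illustrative sketch only: Q-learning over a discretized (SOC, power-demand)
# state, with power-demand transitions modelled by a probability matrix
# counted from a driving schedule. All models and sizes are assumptions.

N_SOC, N_PDEM, N_TORQUE = 20, 10, 8      # assumed discretization sizes
rng = np.random.default_rng(0)

def learn_transition_matrix(pdem_trace, n_states=N_PDEM):
    """Estimate P(p' | p) by counting power-demand transitions observed
    in a specific driving schedule (a sequence of integer bin indices)."""
    counts = np.zeros((n_states, n_states))
    for p, p_next in zip(pdem_trace[:-1], pdem_trace[1:]):
        counts[p, p_next] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    counts[row_sums[:, 0] == 0] = 1.0    # unseen bins -> uniform row
    return counts / counts.sum(axis=1, keepdims=True)

def fuel_rate(torque_idx):
    """Placeholder engine fuel-consumption cost, monotone in torque."""
    return 0.1 + 0.05 * torque_idx

def next_soc(soc_idx, torque_idx, pdem_idx):
    """Placeholder battery dynamics: the battery covers whatever the
    engine-generator set does not supply."""
    delta = pdem_idx - torque_idx        # crude net battery-power proxy
    return int(np.clip(soc_idx - np.sign(delta), 0, N_SOC - 1))

def q_learning(pdem_trace, episodes=200, alpha=0.1, gamma=0.95, eps=0.1):
    """Learn a torque policy that minimizes accumulated fuel cost."""
    P = learn_transition_matrix(pdem_trace)
    Q = np.zeros((N_SOC, N_PDEM, N_TORQUE))
    for _ in range(episodes):
        soc, pdem = N_SOC // 2, int(pdem_trace[0])
        for _ in range(len(pdem_trace) - 1):
            # epsilon-greedy choice of engine torque; cost is minimized,
            # so the greedy action is the argmin of Q
            if rng.random() < eps:
                a = int(rng.integers(N_TORQUE))
            else:
                a = int(np.argmin(Q[soc, pdem]))
            cost = fuel_rate(a)
            soc_next = next_soc(soc, a, pdem)
            pdem_next = int(rng.choice(N_PDEM, p=P[pdem]))  # sample learned matrix
            target = cost + gamma * Q[soc_next, pdem_next].min()
            Q[soc, pdem, a] += alpha * (target - Q[soc, pdem, a])
            soc, pdem = soc_next, pdem_next
    return Q

# Example usage with a synthetic driving schedule (random demand bin indices):
demand_trace = rng.integers(N_PDEM, size=500)
Q_table = q_learning(demand_trace)
policy = Q_table.argmin(axis=2)          # greedy torque index per (SOC, demand)
```

The sampling step `rng.choice(N_PDEM, p=P[pdem])` is where the learned transition probability matrix enters the update, which is the adaptive element the abstract emphasizes relative to SDP with a fixed model.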

Original language: English
Article number: 7234919
Pages (from-to): 7837-7846
Number of pages: 10
Journal: IEEE Transactions on Industrial Electronics
Volume: 62
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2015

Keywords

  • Adaptability
  • Q-learning algorithm
  • energy management
  • hybrid electric tracked vehicle
  • stochastic dynamic programming (SDP)
  • state of charge (SOC)
