Abstract
The performance of energy management strategies (EMSs) for hybrid electric vehicles (HEVs) is strongly affected by the vehicles' driving conditions. To handle the energy management problem under various unknown driving conditions, a real-time adaptive EMS is developed through a novel model-based reinforcement learning (RL) algorithm, Monte Carlo tree search (MCTS). First, an RL model of the HEV is constructed, consisting of a deterministic powertrain approximation model and a stochastic recursive Markov chain. During online implementation, the model is continuously updated according to new conditions, keeping it accurate. Then, the MCTS algorithm is detailed. Unlike traditional RL algorithms, which learn a complete policy for a driving cycle, MCTS searches for the optimal action in real time for each encountered HEV state. By combining the dynamic RL model with the real-time MCTS algorithm, the EMS maintains satisfactory performance in various driving conditions without prior information. In simulation, the proposed strategy consumes 8.03%, 5.20%, and 4.34% less fuel than model-free Q-learning, model-based Q-learning, and model predictive control, respectively, in a completely unknown cycle, and maintains comparable performance in four other cycles, demonstrating its superior adaptability. Experiments on a test bench further validate its effectiveness.
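To make the structure described above more concrete, the sketch below is a minimal, heavily simplified Python illustration and not the authors' implementation: the powertrain surrogate `step`, the engine-power action grid, the demand bins, the reward weights, and every numerical constant are hypothetical placeholders. It only mirrors the ingredients named in the abstract, namely a deterministic plant approximation, a Markov chain over power demand whose transition counts are updated online, and an MCTS loop (here collapsed into a UCT tree policy without a separate rollout phase) that searches the best immediate action for the current state at each decision step.

```python
import math
import random

# --- Hypothetical, highly simplified HEV surrogate (not the paper's model) ---
# State: (soc, demand_bin). Action: engine power [kW] from a small grid.
ACTIONS = [0.0, 10.0, 20.0, 30.0]          # candidate engine power levels [kW]
DEMAND_LEVELS = [5.0, 15.0, 25.0, 35.0]    # discretized power-demand bins [kW]
DT = 1.0                                   # decision step [s]
BATT_KWH = 1.5                             # usable battery capacity [kWh]

def step(state, action, demand_bin):
    """Deterministic powertrain approximation: the battery covers the gap
    between demand and engine power; fuel cost grows with engine power (toy)."""
    soc, _ = state
    batt_kw = DEMAND_LEVELS[demand_bin] - action
    soc_next = max(0.0, min(1.0, soc - batt_kw * DT / 3600.0 / BATT_KWH))
    fuel = 0.08 * action * DT                     # toy fuel-rate model [g]
    soc_penalty = 50.0 * abs(soc_next - 0.6)      # keep SOC near a target value
    return (soc_next, demand_bin), -(fuel + soc_penalty)

class MarkovDemand:
    """Markov chain over demand bins, updated recursively from measured data."""
    def __init__(self, n):
        self.counts = [[1.0] * n for _ in range(n)]   # Laplace-smoothed counts
    def update(self, prev_bin, next_bin):
        self.counts[prev_bin][next_bin] += 1.0
    def sample(self, prev_bin):
        row = self.counts[prev_bin]
        return random.choices(range(len(row)), weights=row)[0]

class Node:
    def __init__(self):
        self.visits = 0
        self.value = 0.0
        self.children = {}   # action index -> Node

def mcts_action(state, demand_model, n_sims=200, depth=10, c_uct=1.4):
    """Search the best immediate action for the current state (receding horizon)."""
    root = Node()
    for _ in range(n_sims):
        node, s, total, discount = root, state, 0.0, 1.0
        path = [root]
        for _ in range(depth):
            if len(node.children) < len(ACTIONS):
                a = len(node.children)             # expand an untried action
                node.children[a] = Node()
            else:                                  # UCT selection over actions
                a = max(node.children, key=lambda k: node.children[k].value
                        / max(1, node.children[k].visits)
                        + c_uct * math.sqrt(math.log(node.visits + 1)
                                            / (node.children[k].visits + 1e-9)))
            node = node.children[a]
            path.append(node)
            next_demand = demand_model.sample(s[1])   # stochastic demand transition
            s, r = step(s, ACTIONS[a], next_demand)   # deterministic powertrain step
            total += discount * r
            discount *= 0.95
        for n in path:                                # backpropagate the return
            n.visits += 1
            n.value += total
    best = max(root.children, key=lambda k: root.children[k].visits)
    return ACTIONS[best]

if __name__ == "__main__":
    demand_model = MarkovDemand(len(DEMAND_LEVELS))
    state, prev_bin = (0.6, 1), 1
    for t in range(5):
        a = mcts_action(state, demand_model)
        next_bin = random.randrange(len(DEMAND_LEVELS))  # stand-in for measured demand
        demand_model.update(prev_bin, next_bin)          # online model update
        state, _ = step(state, a, next_bin)
        prev_bin = next_bin
        print(f"t={t}s  engine={a:4.1f} kW  SOC={state[0]:.3f}")
```

Used in this receding-horizon fashion, only the first action returned by `mcts_action` is applied at each step, the measured demand is fed back into the Markov chain, and the search restarts from the newly observed state, which is what allows the strategy to adapt to unknown driving conditions without a precomputed policy.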
Original language | English |
---|---|
Pages (from-to) | 1 |
Number of pages | 1 |
Journal | IEEE Transactions on Transportation Electrification |
DOIs | |
Publication status | Accepted/In press - 2023 |
Keywords
- Adaptation models
- Computational modeling
- Energy management
- Fuels
- Hybrid electric vehicles
- Monte Carlo tree search
- Q-learning
- Real-time systems
- hybrid electric vehicle
- real-time energy management
- reinforcement learning