TY - JOUR
T1 - Entire route eco-driving method for electric bus based on rule-based reinforcement learning
AU - Yang, Lan
AU - Hu, Zhiqiang
AU - Wang, Liang
AU - Liu, Yang
AU - He, Jiangbo
AU - Qu, Xiaobo
AU - Zhao, Xiangmo
AU - Fang, Shan
N1 - Publisher Copyright:
© 2024 Elsevier Ltd
PY - 2024/9
Y1 - 2024/9
N2 - Electric buses (EBs) have gradually become one of the main modes of transportation in cities due to their low energy consumption and low pollutant emissions. As battery endurance is easily affected by various factors such as external temperature, vehicle load, and driving habits, range anxiety for EBs has become a concern for researchers. To bridge this gap, an eco-driving method based on deep reinforcement learning (DRL) is proposed to achieve entire-route energy saving. Firstly, significant factors including the dynamic passenger load and air conditioning are considered in the energy consumption model of the EB. Secondly, a rule-based reinforcement learning algorithm is utilized to optimize the driving speed and strategy, which accelerates the convergence of the proposed model and improves the average reward of the reward function. Thirdly, by adjusting the reward function of the reinforcement learning algorithm, three eco-driving modes of the EB under various operational states are proposed, namely the efficiency priority mode, the energy-efficiency balance mode, and the energy saving priority mode. Finally, the results indicate that the efficiency priority mode achieves about an 8% increase in traffic efficiency and a reduction of approximately 20% in energy consumption compared to the baseline model. With the energy-efficiency balance mode, the model attains a 34.05% reduction in energy consumption with almost the same traffic efficiency. Under the energy saving priority mode, the proposed model exhibits a minor reduction in traffic efficiency within an acceptable limit but decreases energy consumption by 40.69%, achieving the optimization goals.
AB - Electric buses (EBs) have gradually become one of the main modes of transportation in cities due to their low energy consumption and low pollutant emissions. As battery endurance is easily affected by various factors such as external temperature, vehicle load, and driving habits, range anxiety for EBs has become a concern for researchers. To bridge this gap, an eco-driving method based on deep reinforcement learning (DRL) is proposed to achieve entire-route energy saving. Firstly, significant factors including the dynamic passenger load and air conditioning are considered in the energy consumption model of the EB. Secondly, a rule-based reinforcement learning algorithm is utilized to optimize the driving speed and strategy, which accelerates the convergence of the proposed model and improves the average reward of the reward function. Thirdly, by adjusting the reward function of the reinforcement learning algorithm, three eco-driving modes of the EB under various operational states are proposed, namely the efficiency priority mode, the energy-efficiency balance mode, and the energy saving priority mode. Finally, the results indicate that the efficiency priority mode achieves about an 8% increase in traffic efficiency and a reduction of approximately 20% in energy consumption compared to the baseline model. With the energy-efficiency balance mode, the model attains a 34.05% reduction in energy consumption with almost the same traffic efficiency. Under the energy saving priority mode, the proposed model exhibits a minor reduction in traffic efficiency within an acceptable limit but decreases energy consumption by 40.69%, achieving the optimization goals.
KW - Deep reinforcement learning
KW - Eco-driving
KW - Electric bus
KW - Entire route speed optimization
UR - http://www.scopus.com/inward/record.url?scp=85198544654&partnerID=8YFLogxK
U2 - 10.1016/j.tre.2024.103636
DO - 10.1016/j.tre.2024.103636
M3 - Article
AN - SCOPUS:85198544654
SN - 1366-5545
VL - 189
JO - Transportation Research Part E: Logistics and Transportation Review
JF - Transportation Research Part E: Logistics and Transportation Review
M1 - 103636
ER -