TY - JOUR
T1 - An efficient intelligent energy management strategy based on deep reinforcement learning for hybrid electric flying car
AU - Yang, Chao
AU - Lu, Zhexi
AU - Wang, Weida
AU - Wang, Muyao
AU - Zhao, Jing
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/10/1
Y1 - 2023/10/1
N2 - Hybrid electric flying cars hold clear potential to support high-mobility and environmentally friendly transportation. For hybrid electric flying cars, overall performance and efficiency depend heavily on the coordination of the electrical and fuel systems across the ground and air dual modes. However, the large differences in the scale and fluctuation characteristics of energy demand between ground driving and air flight make efficient control of the energy flow more complex. Designing a power-coordinated control strategy for hybrid electric flying cars is therefore a challenging technical problem. This paper proposes a deep reinforcement learning-based energy management strategy (EMS) for a series hybrid electric flying car. A mathematical model is established of the series hybrid electric flying car, which is driven by a distributed hybrid electric propulsion system (HEPS) consisting mainly of battery packs, twin turboshaft engine and generator sets (TGSs), 16 rotor-motors, and 4 wheel-motors. Subsequently, a Double Deep Q Network (DDQN)-based EMS covering the ground and air dual driving modes is proposed. A method for reducing the number of control variables is designed to improve exploration efficiency and accelerate convergence. In addition, the frequent engine on/off problem is taken into account. Finally, the DDQN-based and dynamic programming (DP)-based EMSs are applied to investigate the power flow distribution in two very different hypothetical driving scenarios: search and rescue (SAR) and urban air mobility (UAM). The results demonstrate the effectiveness of the DDQN-based EMS and its ability to reduce computation time.
AB - Hybrid electric flying cars hold clear potential to support high-mobility and environmentally friendly transportation. For hybrid electric flying cars, overall performance and efficiency depend heavily on the coordination of the electrical and fuel systems across the ground and air dual modes. However, the large differences in the scale and fluctuation characteristics of energy demand between ground driving and air flight make efficient control of the energy flow more complex. Designing a power-coordinated control strategy for hybrid electric flying cars is therefore a challenging technical problem. This paper proposes a deep reinforcement learning-based energy management strategy (EMS) for a series hybrid electric flying car. A mathematical model is established of the series hybrid electric flying car, which is driven by a distributed hybrid electric propulsion system (HEPS) consisting mainly of battery packs, twin turboshaft engine and generator sets (TGSs), 16 rotor-motors, and 4 wheel-motors. Subsequently, a Double Deep Q Network (DDQN)-based EMS covering the ground and air dual driving modes is proposed. A method for reducing the number of control variables is designed to improve exploration efficiency and accelerate convergence. In addition, the frequent engine on/off problem is taken into account. Finally, the DDQN-based and dynamic programming (DP)-based EMSs are applied to investigate the power flow distribution in two very different hypothetical driving scenarios: search and rescue (SAR) and urban air mobility (UAM). The results demonstrate the effectiveness of the DDQN-based EMS and its ability to reduce computation time.
KW - Double deep Q network
KW - Energy management strategy
KW - Flying cars
KW - Ground and air dual driving mode
KW - Hybrid electric propulsion system
UR - http://www.scopus.com/inward/record.url?scp=85163389957&partnerID=8YFLogxK
U2 - 10.1016/j.energy.2023.128118
DO - 10.1016/j.energy.2023.128118
M3 - Article
AN - SCOPUS:85163389957
SN - 0360-5442
VL - 280
JO - Energy
JF - Energy
M1 - 128118
ER -