Abstract
As a form of adaptive optimal control, Q-learning based supervisory control for hybrid electric vehicle (HEV) energy management has rarely been studied with respect to its adaptability. In real-world driving scenarios, conditions such as vehicle load, road conditions, and traffic conditions may vary. If these changes occur and the vehicle supervisory control does not adapt to them, the resulting fuel economy may not be optimal. To the best of our knowledge, this study investigates the adaptability of Q-learning based supervisory control for HEVs for the first time. A comprehensive analysis is presented to interpret this adaptability with respect to three varying factors: driving cycle, vehicle load condition, and road grade. A parallel HEV architecture is considered, and Q-learning is used as the reinforcement learning algorithm to control the torque split between the engine and the electric motor. Model predictive control, the equivalent consumption minimization strategy, and a thermostatic control strategy are implemented for comparison. The Q-learning based supervisory control shows strong adaptability under different conditions and achieves the best fuel economy among the four supervisory controls under all three varying conditions.
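The abstract describes tabular Q-learning applied to the engine/motor torque split. Below is a minimal, self-contained sketch of how such a controller could be set up; the SOC and power-demand discretization, the toy plant model in `step`, and the fuel-plus-SOC reward are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch of tabular Q-learning for HEV torque split (illustrative only;
# the paper's exact state/action discretization and reward are not given here).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretization: battery SOC bins x driver power-demand bins.
N_SOC, N_PDEM = 10, 10
# Actions: engine torque fraction of total demand (0 = all-electric, 1 = engine only).
ACTIONS = np.linspace(0.0, 1.0, 5)

Q = np.zeros((N_SOC, N_PDEM, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration rate

def reward(engine_frac, p_dem, soc):
    """Toy reward: penalize fuel use (engine power proxy) and SOC deviation."""
    fuel_proxy = engine_frac * p_dem
    soc_penalty = (soc - 0.6) ** 2     # keep SOC near a nominal 60%
    return -(fuel_proxy + 10.0 * soc_penalty)

def step(soc_bin, p_bin, engine_frac):
    """Toy plant: the battery supplies the remaining demand, shifting SOC."""
    p_dem = p_bin / (N_PDEM - 1)
    soc = soc_bin / (N_SOC - 1)
    soc_next = np.clip(soc - 0.02 * (1.0 - engine_frac) * p_dem, 0.0, 1.0)
    return (int(round(soc_next * (N_SOC - 1))),
            rng.integers(N_PDEM),
            reward(engine_frac, p_dem, soc))

for _ in range(200):                       # training episodes
    s, p = N_SOC // 2, rng.integers(N_PDEM)
    for _ in range(500):                   # time steps per episode
        # Epsilon-greedy action selection.
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[s, p]))
        s2, p2, r = step(s, p, ACTIONS[a])
        # Standard Q-learning update toward the bootstrapped target.
        Q[s, p, a] += alpha * (r + gamma * np.max(Q[s2, p2]) - Q[s, p, a])
        s, p = s2, p2

print("Greedy engine-torque fraction per (SOC, demand) bin:")
print(ACTIONS[np.argmax(Q, axis=2)])
```

A real energy-management implementation would replace the toy `step` model with a powertrain simulation and a fuel-map-based reward, and would compare the learned policy against MPC, ECMS, and thermostatic baselines as the abstract describes.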
Original language | English |
---|---|
Pages (from-to) | 6797-6806 |
Number of pages | 10 |
Journal | IEEE Transactions on Intelligent Transportation Systems |
Volume | 23 |
Issue number | 7 |
DOIs | |
State | Published - Jul 1 2022 |
Bibliographical note
Publisher Copyright: © 2000-2011 IEEE.
Funding
This work was supported in part by the National Natural Science Foundation of China under Grant 51875054 and Grant 51705044 and in part by the Chongqing Natural Science Foundation for Distinguished Young Scholars, Chongqing Science and Technology Bureau, China, under Grant cstc2019jcyjjq0010.
Funders | Funder number |
---|---|
Chongqing Natural Science Foundation for Distinguished Young Scholars | |
Chongqing Science and Technology Bureau | cstc2019jcyjjq0010
National Natural Science Foundation of China (NSFC) | 51705044, 51875054 |
Keywords
- Q-learning
- Reinforcement learning
- hybrid electric vehicle
- real-time implementation
- supervisory control
ASJC Scopus subject areas
- Automotive Engineering
- Mechanical Engineering
- Computer Science Applications