Hydrogen-based electric vehicles such as Fuel Cell Hybrid Electric Vehicles (FCHEVs) play an important role in achieving zero carbon emissions while simultaneously easing the pressure of the fuel economy crisis. This paper addresses energy management design with respect to several performance metrics: power tracking and system accuracy, fuel cell lifetime, battery lifetime, and the reduction of transient and peak currents on the Polymer Electrolyte Membrane Fuel Cell (PEMFC) and the Li-ion battery. The proposed algorithm combines reinforcement learning in the low-level control loops with high-level supervisory control based on fuzzy-logic load sharing, implemented in the system under consideration. More specifically, this paper establishes a power-system model with three DC-DC converters and a hierarchical energy management framework employing a two-layer control strategy. In the low-level layer, three control loops for the hybrid electric vehicle are designed using reinforcement learning: the Twin Delayed Deep Deterministic Policy Gradient (DDPG-TD3) algorithm is used with an actor-critic network, and three DRL controllers are designed within the hierarchical energy optimization control architecture. Comparative results between the two strategies, Deep Reinforcement Learning with Fuzzy-logic supervisory control (DRL-F) and the Super-Twisting algorithm with Fuzzy-logic supervisory control (STW-F), under the EUDC driving cycle indicate that the proposed DRL-F model reduces the Root Mean Square Error (RMSE) by 21.05% and the Mean Error by 8.31% compared with the STW-F method. The results demonstrate a more robust, accurate, and precise system in the presence of uncertainties and disturbances in the Energy Management System (EMS) of the FCHEV, based on an advanced learning method.
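To make the two-layer idea concrete, the sketch below illustrates (in Python) a toy fuzzy-style supervisory load split between the fuel cell branch and the battery/ultracapacitor branch, together with the RMSE and mean-error tracking metrics used for comparison. All function names, membership ranges, and rules here are illustrative assumptions for exposition, not the authors' implementation; the "Mean Error" is read as a mean absolute tracking error.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def supervisory_split(p_demand_kw, soc):
    """Toy Mamdani-style load sharing (hypothetical): returns the fraction of
    the demanded power assigned to the fuel cell; the battery/ultracapacitor
    branch absorbs the remainder (transients and peaks)."""
    # Input memberships (ranges are assumptions for illustration only).
    load_low  = tri(p_demand_kw, -1.0, 0.0, 30.0)
    load_high = tri(p_demand_kw, 10.0, 60.0, 61.0)
    soc_low   = tri(soc, -0.01, 0.20, 0.60)
    soc_high  = tri(soc, 0.40, 0.90, 1.01)

    # Rule base: low load and high SoC -> small fuel-cell share; high load or
    # low SoC -> large share (protects the battery, limits peak FC current).
    rules = [
        (min(load_low,  soc_high), 0.2),
        (min(load_low,  soc_low),  0.7),
        (min(load_high, soc_high), 0.6),
        (min(load_high, soc_low),  0.9),
    ]
    num = sum(w * share for w, share in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den  # weighted-average defuzzification

def tracking_metrics(p_ref, p_actual):
    """RMSE and mean absolute error of power tracking."""
    err = np.asarray(p_actual, float) - np.asarray(p_ref, float)
    return float(np.sqrt(np.mean(err ** 2))), float(np.mean(np.abs(err)))

# Example: split a 45 kW demand at 55% battery SoC, then score a tracking run.
fc_fraction = supervisory_split(45.0, 0.55)
rmse, mean_err = tracking_metrics([10, 20, 30], [10.4, 19.1, 31.2])
```

In the paper's architecture the low-level tracking of each branch's reference would be handled by the DDPG-TD3 controllers (or the Super-Twisting controllers in the STW-F baseline); the split and metric functions above only stand in for the supervisory layer and the evaluation step.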
Keywords: Deep reinforcement learning; Energy management; Fuel cell electric vehicle; Fuzzy control; PEM fuel cell; Ultracapacitor.
© 2024. The Author(s).