Publication Type : Journal Article
Thematic Areas : Center for Computational Engineering and Networking (CEN)
Source : IEEE Access
Url : https://ieeexplore.ieee.org/document/10522634
Campus : Coimbatore
School : School of Artificial Intelligence - Coimbatore
Year : 2024
Abstract : A charging station that integrates renewable energy sources is a promising solution to address the increasing demand for electric vehicle (EV) charging without expanding the distribution network. An efficient and flexible energy management strategy is essential for effectively integrating various energy sources and EVs. This research work aims to develop an Energy Management System (EMS) for an EV charging station (EVCS) that minimizes the operating cost of the EVCS operator while meeting the energy demands of connected EVs. The proposed approach employs a model-free method leveraging Deep Reinforcement Learning (DRL) to identify optimal schedules of connected EVs in real time. A Markov Decision Process (MDP) model is constructed from the perspective of the EVCS operator. Real-world scenarios are formulated by accounting for the stochastic nature of renewable energy generation and the commuting behavior of EVs. Various DRL algorithms for addressing MDPs are examined, and their performances are empirically compared. Notably, the Truncated Quantile Critics (TQC) algorithm emerges as the superior choice, yielding enhanced model performance. The simulation findings show that the proposed EMS offers an enhanced control strategy, reducing the charging cost for EVCS operators compared to other benchmark methods.
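Example : The following is a minimal sketch, not the authors' implementation, of how the described pipeline could be set up in Python: a simplified single-EV charging MDP (illustrative PV profile, tariff, and reward) wrapped as a Gymnasium environment and trained with the Truncated Quantile Critics algorithm from sb3-contrib. All class names, dynamics, and parameters below are assumptions for illustration only.

```python
# Minimal sketch (not the authors' implementation): a toy single-EV charging MDP
# trained with Truncated Quantile Critics (TQC) from sb3-contrib. The environment
# dynamics, PV profile, tariff, and reward shaping are illustrative assumptions.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from sb3_contrib import TQC


class EVChargingEnv(gym.Env):
    """Toy EVCS environment: one EV, a PV source, and a time-varying grid price."""

    def __init__(self, horizon=24, max_power_kw=11.0, battery_kwh=40.0):
        super().__init__()
        self.horizon = horizon
        self.max_power_kw = max_power_kw
        self.battery_kwh = battery_kwh
        # Observation: [hour/24, state of charge, PV power (kW), price ($/kWh)]
        self.observation_space = spaces.Box(low=0.0, high=np.inf, shape=(4,), dtype=np.float32)
        # Action: normalized charging power in [0, 1], scaled to max_power_kw
        self.action_space = spaces.Box(low=0.0, high=1.0, shape=(1,), dtype=np.float32)

    def _obs(self):
        return np.array([self.t / self.horizon, self.soc, self.pv[self.t], self.price[self.t]],
                        dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.soc = float(self.np_random.uniform(0.2, 0.5))  # initial state of charge
        hours = np.arange(self.horizon)
        # Illustrative stochastic PV output and a simple peak/off-peak tariff
        self.pv = np.clip(5.0 * np.sin(np.pi * (hours - 6) / 12), 0, None)
        self.pv *= self.np_random.uniform(0.7, 1.3)
        self.price = 0.10 + 0.08 * (hours >= 17)
        return self._obs(), {}

    def step(self, action):
        charge_kw = float(action[0]) * self.max_power_kw
        grid_kw = max(charge_kw - self.pv[self.t], 0.0)   # PV is used first, grid covers the rest
        cost = grid_kw * self.price[self.t]                # one-hour time step
        self.soc = min(self.soc + charge_kw / self.battery_kwh, 1.0)
        self.t += 1
        terminated = self.t >= self.horizon
        # Negative operating cost, plus a penalty if the EV leaves under-charged
        reward = -cost - (10.0 * (1.0 - self.soc) if terminated else 0.0)
        obs = self._obs() if not terminated else np.zeros(4, dtype=np.float32)
        return obs, reward, terminated, False, {}


if __name__ == "__main__":
    env = EVChargingEnv()
    model = TQC("MlpPolicy", env, verbose=0)  # TQC handles the continuous charging action
    model.learn(total_timesteps=20_000)
    obs, _ = env.reset()
    action, _ = model.predict(obs, deterministic=True)
    print("First-hour charging setpoint (kW):", float(action[0]) * env.max_power_kw)
```

In the paper's setting, the state and reward would instead reflect the full EVCS operator's perspective (multiple connected EVs, forecast renewable generation, and commuting behavior); this sketch only illustrates the MDP-plus-TQC workflow.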
Cite this Research Publication : G. S. Asha Rani, P. S. Lal Priya, J. Jayan, R. Satheesh, and M. L. Kolhe, "Data-Driven Energy Management of an Electric Vehicle Charging Station Using Deep Reinforcement Learning," IEEE Access, vol. 12, pp. 65956-65966, 2024, doi: 10.1109/ACCESS.2024.3398059.