
Access Control Method for EV Charging Stations Based on State Aggregation and Q-Learning

TANG Ziyu1, LUO Yonglong1, FANG Daohong2, ZHAO Chuanxin1   

  1. School of Computer and Information, Anhui Normal University, Wuhu 241002, China;
    2. Electrical Engineering and Automation, Hefei University of Technology, Hefei 230009, China
  • Received: 2021-05-12  Revised: 2021-08-04  Online: 2022-11-25  Published: 2022-12-23
  • Supported by:
    This paper was supported by the National Natural Science Foundation of China under Grant Nos. 61871412, 61972439.

TANG Ziyu, LUO Yonglong, FANG Daohong, ZHAO Chuanxin. Access Control Method for EV Charging Stations Based on State Aggregation and Q-Learning[J]. Journal of Systems Science and Complexity, 2022, 35(6): 2145-2165.

This paper presents an intelligent access control framework for an electric vehicle (EV) charging station that dynamically and adaptively manages charging requests from randomly arriving EVs in order to increase the revenue of the station. First, the charging service requests of random EV arrivals are described as an event-driven sequential decision process, in which decision-making relies on an event-extended state composed of the real-time electricity price, the real-time charging station state, and the EV arrival event. Second, a state aggregation method is introduced to reduce the state space: the charging station state is first aggregated in the form of the remaining charging times and then further aggregated via sort coding. In addition, a mathematical calculation of the code value is provided, and its uniqueness and consecutive-integer properties are proved. A corresponding Q-learning method is then proposed to derive an optimal or suboptimal access control policy. The results of a case study demonstrate that the proposed learning optimisation method based on the event-extended state aggregation outperforms flat Q-learning: the space complexity and time complexity are significantly reduced, which substantially improves the learning efficiency and optimisation performance.
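To make the abstract's pipeline concrete, the following Python sketch illustrates the three ingredients it names: an event-extended state (price, station state, arrival event), aggregation of the station state by the sorted multiset of remaining charging times, and a tabular Q-learning update over accept/reject actions. Everything in the sketch is an illustrative assumption rather than the paper's formulation: the charger count, discretised price process, reward model, and episode structure are invented for the example, and the paper's closed-form sort-coding formula (with its proved uniqueness and consecutive-integer properties) is replaced by a simple dictionary-based encoding.

```python
"""Minimal sketch of EV charging-station access control with state aggregation
and Q-learning. All environment details are illustrative assumptions."""
import random
from collections import defaultdict

N_CHARGERS = 5        # assumed number of charging piles
MAX_REMAIN = 4        # remaining charging time discretised into 1..MAX_REMAIN slots
PRICE_LEVELS = 3      # discretised real-time electricity price levels
ACTIONS = (0, 1)      # 0 = reject the arriving EV, 1 = accept it
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

_code_of = {}         # sorted remaining-time tuple -> integer code
                      # (dictionary stand-in for the paper's sort-coding formula)

def aggregate(remaining_times):
    """Aggregate the station state: only the multiset of remaining charging
    times matters, so sort the tuple and map it to a consecutive integer code."""
    key = tuple(sorted(remaining_times))
    return _code_of.setdefault(key, len(_code_of))

def step(remaining_times, price, accept, demand):
    """Toy one-event transition: chargers that finish free up, the arriving EV
    is optionally admitted, and the reward is an assumed service income minus
    an energy cost at the current price level."""
    reward = 0.0
    times = [t - 1 for t in remaining_times if t > 1]
    if accept and len(times) < N_CHARGERS:
        times.append(demand)
        reward = 1.5 * demand - 0.4 * price * demand   # illustrative revenue model
    next_price = random.randrange(PRICE_LEVELS)        # assumed i.i.d. price process
    return times, next_price, reward

Q = defaultdict(float)   # Q[((station code, price, demand), action)]

def train(episodes=2000, events_per_episode=50):
    for _ in range(episodes):
        times, price = [], random.randrange(PRICE_LEVELS)
        demand = random.randint(1, MAX_REMAIN)          # arriving EV's charging time
        for _ in range(events_per_episode):
            s = (aggregate(times), price, demand)
            a = (random.choice(ACTIONS) if random.random() < EPS
                 else max(ACTIONS, key=lambda x: Q[s, x]))
            times, price, r = step(times, price, a, demand)
            demand = random.randint(1, MAX_REMAIN)      # next EV arrival event
            s2 = (aggregate(times), price, demand)
            Q[s, a] += ALPHA * (r + GAMMA * max(Q[s2, x] for x in ACTIONS) - Q[s, a])

if __name__ == "__main__":
    train()
```

After `train()`, the greedy policy accepts an arriving EV in event-extended state `s` exactly when `Q[s, 1] > Q[s, 0]`; the dictionary-based `aggregate` plays the role of the paper's sort coding by collapsing all permutations of the same remaining-time profile into one table entry.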