DOI: 10.35833/MPCE.2020.000460
Electric Vehicle Charging Management Based on Deep Reinforcement Learning |
Author:
Sichen Li1, Weihao Hu1, Di Cao1, Tomislav Dragičević2, Qi Huang1, Zhe Chen3, Frede Blaabjerg3
Author Affiliation:
1. School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China, Chengdu, China; 2. Department of Electrical Engineering, Center for Electric Power and Energy, Smart Electric Components, Technical University of Denmark, Copenhagen, Denmark; 3. Department of Energy Technology, Aalborg University, Aalborg, Denmark
Foundation:
This work was supported by the Sichuan Science and Technology Program (No. 2020JDJQ0037).
|
|
Abstract:
A time-varying time-of-use electricity price can be exploited to reduce charging costs for electric vehicle (EV) owners. Considering the uncertainty of price fluctuations and the randomness of the EV owner's commuting behavior, we propose a deep reinforcement learning based method to minimize the charging cost of an individual EV. The charging problem is first formulated as a Markov decision process (MDP) with unknown transition probabilities. A modified long short-term memory (LSTM) neural network is used as the representation layer to extract temporal features from the electricity price signal. The deep deterministic policy gradient (DDPG) algorithm, which operates over continuous action spaces, is used to solve the MDP. The proposed method automatically adjusts the charging strategy according to the electricity price to reduce the EV owner's charging cost. Several benchmark methods are also implemented and quantitatively compared with the proposed method, which reduces the charging cost by up to 70.2% relative to these benchmarks.
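The abstract describes formulating the charging problem as an MDP whose state includes recent electricity prices and the battery state of charge (SOC), whose action is a continuous charging power, and whose reward is the negative charging cost. The following is a minimal illustrative sketch of such an environment; all numerical parameters (battery capacity, charger power, SOC targets, penalty weight) are assumptions for illustration, not values from the paper, and the simple price-threshold policy at the end stands in for the learned DDPG policy.

```python
import numpy as np

class EVChargingEnv:
    """Sketch of the EV charging MDP: state = price window + SOC,
    action = continuous charging fraction, reward = -cost."""

    def __init__(self, prices, capacity_kwh=24.0, max_power_kw=6.0,
                 init_soc=0.2, target_soc=0.9, window=4):
        self.prices = np.asarray(prices, dtype=float)  # $/kWh, hourly
        self.capacity = capacity_kwh
        self.max_power = max_power_kw
        self.init_soc = init_soc
        self.target_soc = target_soc
        self.window = window
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.init_soc
        return self._state()

    def _state(self):
        # Recent price history (the temporal features the paper extracts
        # with a modified LSTM layer) concatenated with the current SOC.
        hist = self.prices[max(0, self.t - self.window + 1):self.t + 1]
        hist = np.pad(hist, (self.window - len(hist), 0))
        return np.concatenate([hist, [self.soc]])

    def step(self, action):
        # action in [0, 1]: fraction of maximum charging power for one hour.
        power = float(np.clip(action, 0.0, 1.0)) * self.max_power
        energy = min(power, (1.0 - self.soc) * self.capacity)  # kWh added
        self.soc += energy / self.capacity
        cost = energy * self.prices[self.t]
        self.t += 1
        done = self.t >= len(self.prices)
        # Assumed penalty for missing the target SOC at departure.
        penalty = 10.0 * max(0.0, self.target_soc - self.soc) if done else 0.0
        return self._state(), -(cost + penalty), done

# Illustrative heuristic: charge at full power only when the current price
# is below the mean; a DDPG agent would instead learn a continuous policy
# over this action space from experience.
env = EVChargingEnv(prices=[0.30, 0.10, 0.08, 0.25, 0.12, 0.28])
mean_price = np.mean(env.prices)
state, done, total_reward = env.reset(), False, 0.0
while not done:
    action = 1.0 if env.prices[env.t] < mean_price else 0.0
    state, reward, done = env.step(action)
    total_reward += reward
```

Under this toy price profile the heuristic charges during the three cheapest hours, reaching the target SOC while paying only for low-price energy; the learned policy in the paper plays the same role but adapts continuously to uncertain prices.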
Keywords:
Deep reinforcement learning; data-driven control; uncertainty; electric vehicles (EVs)
Received: July 08, 2020
Online: May 12, 2022