Abstract
Distributed energy resources (DERs) in distribution systems, including renewable generation, micro-turbines, and energy storage, can be used to restore critical loads following extreme events and thereby increase grid resiliency. However, properly coordinating multiple DERs through a multi-step restoration process under renewable uncertainty and limited fuel availability is a complicated sequential optimal control problem. Owing to its ability to handle system non-linearity and uncertainty, reinforcement learning (RL) stands out as a potentially powerful candidate for solving complex sequential control problems. Moreover, the offline training of RL provides excellent action readiness during online operation, making it suitable for problems such as load restoration, where timely, correct, and coordinated actions are needed. In this study, prioritized load restoration in a distribution system is studied on a simplified single-bus model: given imperfect renewable generation forecasts, the performance of an RL controller is compared with that of a deterministic model predictive control (MPC) controller. Our experimental results show that the RL controller is able to learn from experience, adapt to imperfect forecast information, and provide a more reliable restoration process than the baseline MPC controller.
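To make the problem setting concrete, the sketch below illustrates one dispatch step of prioritized load restoration on a single-bus system. This is not the paper's implementation: the load names, demands, priority weights, and the 80 kW available-power figure are all assumptions for illustration. The controller (RL or MPC) would decide how much generation is available at each step; picking up loads in priority order is then the natural single-bus dispatch rule.

```python
def restore_loads(loads, available_power):
    """Pick up loads in descending priority order until the available
    generation on the bus is exhausted.

    loads: list of (name, demand_kw, priority) tuples.
    Returns the names of served loads and the spare capacity left.
    """
    served = []
    remaining = available_power
    for name, demand, priority in sorted(loads, key=lambda l: -l[2]):
        if demand <= remaining:  # serve a load only if it fits entirely
            served.append(name)
            remaining -= demand
    return served, remaining


# Hypothetical loads: (name, demand in kW, priority; higher = more critical).
loads = [
    ("hospital", 40.0, 3),
    ("water_pump", 25.0, 2),
    ("residential", 60.0, 1),
]

# Available power would be microturbine output (fuel-limited) plus actual
# renewable generation plus storage discharge; the error between forecast
# and actual renewable output is what the study's controllers must handle.
served, spare = restore_loads(loads, available_power=80.0)
# hospital (40 kW) and water_pump (25 kW) fit within 80 kW; residential does not.
```

In the multi-step problem the abstract describes, `available_power` changes over time with renewable output and remaining fuel and storage, which is what turns this one-shot rule into a sequential control problem.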
| Original language | American English |
|---|---|
| Number of pages | 9 |
| State | Published - 2020 |
| Event | IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (IEEE SmartGridComm) - Duration: 11 Nov 2020 → 13 Nov 2020 |
Conference

| Conference | IEEE International Conference on Communications, Control, and Computing Technologies for Smart Grids (IEEE SmartGridComm) |
|---|---|
| Period | 11/11/20 → 13/11/20 |
Bibliographical note

See NREL/CP-2C00-79160 for paper as published in proceedings.

NREL Publication Number
- NREL/CP-2C00-77116
Keywords
- grid resiliency
- load restoration
- micro grid
- reinforcement learning