Abstract
In this paper, we propose a curriculum-trained reinforcement learning (RL) controller to facilitate distribution system critical load restoration (CLR), leveraging RL's fast online response and its strength in sequential optimal control. Like many grid control problems, CLR is complicated by a large control action space and renewable generation uncertainty in a heavily constrained, non-linear environment with strong intertemporal dependency. Because of this, an RL policy learned directly often converges to a poor-performing local optimum. To overcome this, we design a two-stage curriculum in which the RL agent progressively learns generation control and load restoration decisions under different scenarios. Via curriculum learning, the trained RL controller is expected to achieve better control performance, restoring critical loads as rapidly and reliably as possible. Using the IEEE 13-bus test system, we illustrate the performance of the RL controller trained by the proposed curriculum-based method.
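To make the two-stage idea concrete, the Python sketch below shows one way such a curriculum could be scheduled: the agent first trains on generation control alone, then continues training with load restoration decisions added. `ToyCLREnv`, `ToyAgent`, the reward terms, and the episode length are hypothetical placeholders for illustration, not the authors' environment or algorithm (the paper uses a full CLR simulation on the IEEE 13-bus system).

```python
# Minimal sketch of a two-stage curriculum schedule; NOT the paper's
# implementation. All names and reward weights here are illustrative.
import random

class ToyCLREnv:
    """Stub restoration environment; 'stage' gates which decisions matter."""
    def __init__(self, stage):
        self.stage = stage  # 1: generation control only, 2: + load restoration
        self.t = 0

    def reset(self):
        self.t = 0
        return 0.0  # toy scalar state

    def step(self, action):
        self.t += 1
        # Stage 1 rewards tracking a generation setpoint; stage 2 also
        # rewards restored load (illustrative shaping only).
        reward = -abs(action["gen"] - 0.5)
        if self.stage == 2:
            reward += action["restore"]
        done = self.t >= 24  # one day at hourly resolution, for example
        return 0.0, reward, done

class ToyAgent:
    """Placeholder policy; its parameters persist across both stages."""
    def __init__(self):
        self.gen_bias = 0.0

    def act(self, state, stage):
        action = {"gen": self.gen_bias + random.uniform(-0.1, 0.1)}
        if stage == 2:
            action["restore"] = random.random()  # restoration decision
        return action

def train_stage(agent, stage, episodes=10):
    env = ToyCLREnv(stage)
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            state, reward, done = env.step(agent.act(state, stage))
        # A real implementation would update the policy from the
        # collected rewards here (e.g., with a policy-gradient method).

agent = ToyAgent()
train_stage(agent, stage=1)  # stage 1: learn generation control
train_stage(agent, stage=2)  # stage 2: add load restoration decisions
```

The essential point of the curriculum is that the same agent object (and hence its learned parameters) carries over between the two calls, so the second stage starts from a policy that already handles generation control rather than from scratch.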
Original language | American English
---|---
Number of pages | 5
State | Published - 2021
Event | 2021 IEEE Power and Energy Society General Meeting, PESGM 2021 - Washington, United States
Duration | 26 Jul 2021 → 29 Jul 2021
Conference
Conference | 2021 IEEE Power and Energy Society General Meeting, PESGM 2021 |
---|---|
Country/Territory | United States |
City | Washington |
Period | 26/07/21 → 29/07/21 |
Bibliographical note
See NREL/CP-2C00-78351 for preprint.

NREL Publication Number: NREL/CP-2C00-82309
Keywords
- curriculum learning
- grid resiliency
- load restoration
- microgrid
- reinforcement learning