Abstract
Modernization of energy systems has led to increased interactions among multiple critical infrastructures and diverse stakeholders, making operational decision making more complex and, at times, beyond the cognitive capabilities of human operators. State-of-the-art machine learning and deep learning approaches show promise for supporting users with complex decision-making challenges, such as those occurring in our rapidly transforming cyber-physical energy systems. However, successful adoption of data-driven decision support technology for critical infrastructure will depend on the ability of these technologies to be trustworthy and contextually interpretable. In this paper, we investigate the feasibility of implementing explainable AI (XAI) for interpretable detection of cyberattacks in the energy system. Leveraging a proof-of-concept simulation use case of detecting a data falsification attack on a photovoltaic system using the XGBoost algorithm, we demonstrate how Local Interpretable Model-Agnostic Explanations (LIME), a widely used XAI approach, can help provide contextual and actionable interpretation of cyberattack detection.
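The LIME approach described in the abstract can be sketched with a numpy-only local surrogate. Here a hypothetical rule-based detector stands in for the paper's XGBoost model, and all feature names, thresholds, perturbation scales, and the assumed power-vs-irradiance relationship are illustrative, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's XGBoost detector: it scores a PV
# telemetry sample as "falsified" when the reported power strays from what
# the measured irradiance implies (power ~ 0.2 * irradiance, an assumed model).
def detector_proba(X):
    residual = np.abs(X[:, 1] - 0.2 * X[:, 0])           # |reported - expected| power
    p_attack = 1.0 / (1.0 + np.exp(-(residual - 10.0)))  # sigmoid attack score
    return np.column_stack([1.0 - p_attack, p_attack])

# Instance to explain: irradiance of 800 W/m^2 implies ~160 kW, but 150 kW
# is reported -- a borderline case for the detector above.
x0 = np.array([800.0, 150.0])
scales = np.array([50.0, 20.0])  # assumed per-feature perturbation scales

# LIME-style local surrogate: perturb around x0, weight perturbed samples
# by proximity to x0, and fit a weighted linear model to the black box.
Z = x0 + rng.normal(size=(5000, 2)) * scales
y = detector_proba(Z)[:, 1]
w = np.exp(-np.sum(((Z - x0) / scales) ** 2, axis=1))      # proximity kernel

A = np.column_stack([np.ones(len(Z)), (Z - x0) / scales])  # intercept + scaled feats
Aw = A * w[:, None]
coef = np.linalg.solve(A.T @ Aw, A.T @ (w * y))            # weighted least squares

for name, c in zip(["intercept", "irradiance", "reported_power"], coef):
    print(f"{name}: {c:+.3f}")
```

The signs of the surrogate coefficients carry the actionable interpretation: for this instance, raising the reported power toward the irradiance-implied value lowers the attack score, while higher irradiance (a larger implied shortfall) raises it.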
| Original language | American English |
|---|---|
| Number of pages | 8 |
| State | Published - 2024 |
| Event | The 2024 Conference on Innovative Smart Grid Technologies, North America (ISGT NA 2024) - Washington DC; Duration: 19 Feb 2024 → 22 Feb 2024 |
Conference
| Conference | The 2024 Conference on Innovative Smart Grid Technologies, North America (ISGT NA 2024) |
|---|---|
| City | Washington DC |
| Period | 19/02/24 → 22/02/24 |
NREL Publication Number
- NREL/CP-5R00-86988
Keywords
- artificial intelligence
- cybersecurity
- energy system
- energy system security
- events and anomaly detection
- explainable artificial intelligence