Explainable Artificial Intelligence (XAI) techniques for energy and power systems: Review, challenges and opportunities

R. Machlev, L. Heistrene, M. Perl, K. Y. Levy, J. Belikov, S. Mannor, Y. Levron

Research output: Contribution to journal › Review article › peer-review

Abstract

Despite their widespread adoption and outstanding performance, machine learning models are often regarded as “black boxes”, since it is difficult to understand how such models operate in practice. Therefore, in the power systems field, which requires a high level of accountability, it is hard for experts to trust and justify the decisions and recommendations made by these models. Meanwhile, in recent years, Explainable Artificial Intelligence (XAI) techniques have been developed to improve the explainability of machine learning models, so that their outputs can be better understood. In this light, the purpose of this paper is to highlight the potential of using XAI for power system applications. We first present the common challenges of using XAI in such applications, then review and analyze recent works on this topic and the ongoing trends in the research community. We hope that this paper will trigger fruitful discussions and encourage further research on this important emerging topic.
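
As an illustration of what an XAI technique can reveal, the minimal sketch below attributes a load-forecast prediction to its input features. It is not taken from the paper: the model, the feature names, and the use of the open-source SHAP library are assumptions made purely for demonstration.

import numpy as np
import shap  # assumed available: the open-source SHAP library
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Hypothetical hourly features: temperature, hour of day, previous-day load
X = rng.normal(size=(500, 3))
# Synthetic "load" target so the example is self-contained and runnable
y = 0.6 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer returns per-feature contributions (SHAP values) for each
# prediction, so an operator can inspect why the model forecast a given value.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:5])
print(shap_values)  # rows: samples, columns: contribution of each feature

Such per-prediction attributions are one way the output of a “black-box” model can be made easier for domain experts to inspect and justify.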

Original language: English
Article number: 100169
Journal: Energy and AI
Volume: 9
DOIs
State: Published - Aug 2022

Keywords

  • Deep-learning
  • Energy
  • Explainable artificial intelligence
  • Neural network
  • Power
  • XAI

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • General Energy
  • Engineering (miscellaneous)
