Green AI: Insights Into Deep Learning's Looming Energy Efficiency Crisis

Research output: NREL Presentation

Abstract

As demands grow to integrate artificial intelligence into every aspect of industry, commerce, and life, deep learning's exploding energy cost has become a looming crisis, making AI systems a salient energy-efficiency challenge. One might expect that doubling a neural network's size would halve its error rate, or at least allow it to achieve greater performance given the same amount of time and energy. I will present clear and substantial scientific evidence that indicates not only is this intuition wildly wrong, but that neural networks scale so poorly that increasing deep learning performance by even a small fraction can easily require an order of magnitude or more increase in computational resources and energy. Further, the marginal price of increasing model performance rapidly explodes as performance targets are raised. To address this challenge, I will provide a toolkit of techniques that can be applied today to mitigate the inefficiency of modern deep learning. I will conclude by illuminating a practical path forward towards efficient, Green AI.
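The abstract's central claim can be illustrated with a hedged numerical sketch. Empirical scaling studies often model error as a power law in compute, error ≈ k · C^(-α), with a small exponent α; the values below (k = 1, α = 0.1) are illustrative assumptions, not figures from the presentation. Inverting the power law shows how the compute needed to hit a target error explodes as the target shrinks:

```python
# Illustrative only: assumes error = k * C**(-alpha) with hypothetical
# constants k and alpha; real scaling exponents vary by task and model.

def compute_for_error(target_error, k=1.0, alpha=0.1):
    """Invert error = k * C**(-alpha) to get the compute C required."""
    return (k / target_error) ** (1.0 / alpha)

base = compute_for_error(0.10)     # compute to reach 10% error
better = compute_for_error(0.09)   # just one point better: 9% error
halved = compute_for_error(0.05)   # halving the error to 5%

print(better / base)   # (10/9)**10 ≈ 2.87x the compute for 1 point
print(halved / base)   # 2**10 = 1024x the compute to halve error
```

Under these assumed constants, shaving a single percentage point off the error already costs nearly 3x the compute, and halving the error costs three orders of magnitude more, matching the abstract's observation that small performance gains can demand disproportionate energy.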
Original language: American English
Number of pages: 15
State: Published - 2022

Publication series

Name: Presented at the AI and Electric Power Summit, 4-6 October 2022, Rome, Italy

NREL Publication Number

  • NREL/PR-2C00-84088

Keywords

  • artificial intelligence
  • deep learning
  • efficient
  • empirical
  • energy efficiency
  • Green AI
  • green computing
  • machine learning
