
Going analog may tame AI’s exploding energy needs

Paul van Gerven
Reading time: 3 minutes

IBM has developed an analog AI chip that is 14 times more energy efficient than the best digital hardware currently available.

Artificial intelligence is booming, but so is its carbon footprint. Training the large language model GPT-3 is estimated to have consumed about 1,300 megawatt-hours, enough to drive 750,000 kilometers in an electric car. Google has revealed that AI accounts for 10-15 percent of the company’s energy use. And OpenAI has published data showing that the computing power used for key AI landmarks has doubled every 3.4 months over the past few years.
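To get a feel for what a 3.4-month doubling time implies, a short sketch can compute the growth factor over a given period. The doubling interval comes from the article; the function name and the example period are illustrative, not from the source.

```python
def compute_growth(months: float, doubling_months: float = 3.4) -> float:
    """Return the growth factor implied by a fixed doubling time.

    With compute doubling every `doubling_months`, after `months`
    months it has grown by a factor of 2 ** (months / doubling_months).
    """
    return 2 ** (months / doubling_months)


# After 34 months (ten doubling periods), compute has grown ~1024-fold.
print(round(compute_growth(34)))
# Over five years, the implied factor already runs into the hundreds of
# thousands, which illustrates why efficiency gains alone struggle to
# keep AI's energy use in check.
print(f"{compute_growth(60):.0f}")
```

The exponential form makes the article's point concrete: at this pace, each additional year multiplies compute demand by roughly 2^(12/3.4), or about elevenfold.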

In a world increasingly threatened by climate change, this cannot continue. So far, innovations that improve the energy efficiency of AI have failed to curb the technology’s carbon footprint. That’s why IBM researchers suggest a more radical approach: ditching the popular GPU in favor of analog technology. Their prototype analog chip, described in this week’s edition of Nature, runs an AI speech recognition model well over 10 times more efficiently than existing hardware.
