
The Surging Problem of AI Energy Consumption

May 9, 2024

When I prepare each episode of Art of Supply, I intentionally dig deep. Usually, along the way, I find a surprise or two… But this topic was full of surprises. 

Perhaps the biggest surprise of all is what I learned about one of the most exhausted topics of the last two years: AI. Everyone wants to talk about AI. But the fact that AI may be the greatest unrecognized threat to the environment today is something we don’t talk about enough. 

Love it or fear it, AI is an energy hog.

AI Power Demand May Grow by 10X

On April 9th, Rene Haas, CEO of British semiconductor and software design company Arm Holdings, made a statement about data center energy consumption ahead of a partnership announcement with U.S.- and Japan-based universities. 

As Haas said, “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less.”

25 percent of all power consumed in the United States might go to data processing in less than 6 years. No wonder all of the interest in AI and advanced computing has been driving up the stock prices of companies that own power plants: Vistra is up by 84 percent and Constellation Energy by 63 percent.

Many of the sources I consulted in preparation for this episode reference a report from the International Energy Agency. Titled simply Electricity 2024: Analysis and Forecast to 2026, this 170-page report is full of data points, analysis, and projections.

For instance, the report states that a request to ChatGPT (one of the most popular examples of generative AI widely available today) requires an average of 2.9 watt-hours of electricity. That is equivalent to running a 60-watt light bulb for about three minutes, and nearly 10 times as much energy as the average Google search. 
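
For a quick gut check on those numbers, here is a minimal back-of-the-envelope sketch in Python. The 0.3 watt-hours per Google search is the commonly cited estimate that the “nearly 10 times” comparison implies; it is an assumption here, not a figure pulled from the IEA report itself.

```python
# Back-of-the-envelope check of the per-query energy figures.
CHATGPT_WH_PER_QUERY = 2.9   # IEA estimate, watt-hours per ChatGPT request
GOOGLE_WH_PER_SEARCH = 0.3   # commonly cited estimate (assumption, not from the IEA report)
BULB_WATTS = 60              # a standard incandescent bulb

# How long a 60-watt bulb runs on 2.9 Wh: time = energy / power.
bulb_minutes = CHATGPT_WH_PER_QUERY / BULB_WATTS * 60
print(f"60 W bulb runtime: {bulb_minutes:.1f} minutes")  # ~2.9 minutes

# Ratio of one ChatGPT request to one Google search.
ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_SEARCH
print(f"ChatGPT vs. Google search: {ratio:.1f}x")  # ~9.7x, i.e. nearly 10 times
```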

If that doesn’t have your mind spinning, AI power demand is expected to grow by at least 10X between 2023 and 2026.

Companies that are heavily invested in the AI and data processing space are well aware of this problem. Microsoft and Google have well-defined plans to achieve net-negative emissions (forget net zero). Apple aspires to be carbon neutral globally, including its supply chain, by 2030. 

How they are all going to hit those targets without changing something about AI energy consumption is a mystery to me.

AI’s Insatiable Need for Energy

AI runs on GPUs (graphics processing units), a type of chip designed to process large amounts of data. Processing requirements and energy consumption increase when the AI is responding to a query: the more complex the model or the larger the dataset, the more energy must be consumed to complete the job.

In addition, queries involving imagery are more energy intensive than those focused on text. Generating one image using AI can consume as much energy as charging a smartphone, according to researchers at Hugging Face, a collaborative AI platform. 

Energy isn’t just consumed when we use AI; it is also consumed when the AI is being trained. 

Alex de Vries, a data scientist and Ph.D. candidate at Vrije Universiteit Amsterdam, talks about a training phase versus an inference phase. Training is the process of setting up the model and teaching it how to learn on its own, while inference is what happens each time the trained model responds to a query or scenario you feed it.

ChatGPT took relatively little energy to train but requires a lot for inference, which makes sense, because each response involves doing complex things in a very human-friendly way. 

ChatGPT-3 was the version cited as consuming about 10 times as much energy per query as a Google search. GPT-4 likely uses more power because it is a larger model with more parameters.

The Battle Still to Come

Part of what makes AI’s energy requirements such a concern is sheer demand. ChatGPT is processing somewhere in the neighborhood of two hundred million requests per day. To do that, it consumes more than half a million kilowatt-hours of electricity daily.

As the New Yorker explained, the average U.S. household consumes twenty-nine kilowatt-hours a day. Don’t reach for your calculator – I’ve already done the math. ChatGPT – just one of the generative AIs available – is currently using the equivalent of more than 17,000 private homes’ daily consumption.
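
If you do want to check it, the arithmetic fits in a few lines of Python; the request volume and consumption figures are the estimates quoted above, not measured values.

```python
# Reproducing the household-equivalent math from the figures above.
CHATGPT_KWH_PER_DAY = 500_000    # estimated daily consumption, kWh
HOUSEHOLD_KWH_PER_DAY = 29       # average U.S. household, per the New Yorker
REQUESTS_PER_DAY = 200_000_000   # estimated daily ChatGPT requests

households = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
print(f"Household equivalents: {households:,.0f}")  # ~17,241 homes

# Implied energy per request, converted from kWh to Wh, as a cross-check
# against the IEA's 2.9 Wh figure.
wh_per_request = CHATGPT_KWH_PER_DAY * 1_000 / REQUESTS_PER_DAY
print(f"Implied energy per request: {wh_per_request:.1f} Wh")  # ~2.5 Wh
```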

But AI isn’t the only high-tech energy hog we have to deal with; data centers are an issue too. According to the same International Energy Agency report referenced earlier, data centers account for about 1 to 1.5 percent of global electricity use.

The data center industry is responsible for somewhere between 2 and 3 percent of greenhouse gas emissions, thanks in large part to those GPUs. A GPU requires 10–15 times as much energy as a traditional CPU because it uses more transistors. All of those running servers generate a lot of heat, which has to be removed to protect the machines. That requires energy for air conditioning, but also water – another critical resource.

Shaolei Ren, a professor of electrical and computer engineering at the University of California Riverside, told the Wall Street Journal that “ChatGPT-3 needs to ‘drink’ a 500-milliliter bottle of water for a basic conversation of 20 to 50 inquiries.” And GPT-4, as well as any releases that follow it, will likely need more. Possibly a lot more.
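
Ren’s estimate works out to a surprisingly concrete per-question cost. A quick sketch using the bounds of his range:

```python
# Water cost per inquiry, from Ren's 500 mL per 20-50 inquiries.
BOTTLE_ML = 500
LOW_INQUIRIES, HIGH_INQUIRIES = 20, 50

# More inquiries per bottle means less water per inquiry.
low_ml = BOTTLE_ML / HIGH_INQUIRIES   # 10 mL per inquiry
high_ml = BOTTLE_ML / LOW_INQUIRIES   # 25 mL per inquiry
print(f"Cooling water per inquiry: {low_ml:.0f} to {high_ml:.0f} mL")
```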

I’m not overly afraid of a takeover by robot overlords, but if there were ever to be one, I would expect it to look more like the movie Terminator. If you’ve seen a video of Boston Dynamics’ latest humanoid robot standing up from a prostrate position without assistance, you know it makes you stop and think.

But maybe that’s just a distraction. Maybe the real risk – or threat? – will be associated with the battle between humans and machines over energy.
