The AI boom in electricity demand: a multiply recycled myth

by John Q on November 30, 2024

I posted this piece in RenewEconomy a couple of months ago. It didn’t convince the commenters then, and I don’t expect it to be any different here, but I’m putting it on the record anyway.

AI won’t use as much electricity as we are told, and it’s not a reason to slow transition to renewables

The recent rise of “generative AI” models has led to a lot of dire predictions about the associated requirements for energy. It has been estimated that AI will consume anything from 9 to 25 per cent of all US electricity by 2032.

But we have been here before. Predictions of this kind have been made ever since the emergence of the Internet as a central part of modern life, often tied to claims and counterclaims about the transition to renewable energy.

Back in 1999, Forbes magazine ran a piece headlined “Dig more coal — the PCs are coming”. This article claimed that personal computers would use 50 per cent of US electricity within a decade. The unsubtle implication was that any attempt to reduce carbon dioxide emissions was doomed to failure.

Of course, this prediction wasn’t borne out. Computing power has increased a thousand-fold since the turn of the century. But far from demanding more electricity, personal computers have become more efficient, with laptops mostly replacing large standalone boxes and software improvements reducing waste.

A typical home computer now consumes around 30 to 60 watts when it is operating, less than a bar fridge or an incandescent light bulb.

The rise of large data centres and cloud computing produced another round of alarm. A US EPA report in 2007 predicted a doubling of demand every five years. Again, this number fed into a range of debates about renewable energy and climate change.

Yet throughout this period, the IT sector’s share of electricity use has hovered between 1 and 2 per cent, and the sector is responsible for less than 1 per cent of global greenhouse gas emissions. By contrast, the unglamorous and largely disregarded business of making cement accounts for around 7 per cent of global emissions.

Will generative AI change this pattern? Not for quite a while. Although most business organisations now use AI for some purposes, it typically accounts for only 5 to 10 per cent of IT budgets.

Even if that share doubled or tripled, the impact would be barely noticeable. Looking at the other side of the market, OpenAI, the maker of ChatGPT, is bringing in around $3 billion a year in sales revenue, and has spent around $7 billion developing its model. Even if every penny of that were spent on electricity, the effect would be little more than a blip.
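As a rough back-of-the-envelope check (the figures here are round-number assumptions of mine: an industrial electricity price of about 8 US cents per kilowatt-hour and world consumption of roughly 28,000 terawatt-hours a year), even the extreme case where the entire $7 billion went on electricity comes to a fraction of one per cent of world consumption:

```python
# Back-of-the-envelope sketch: what if OpenAI's entire development spend
# had gone on electricity? Price and consumption figures are rough assumptions.

openai_dev_spend_usd = 7e9        # ~$7 billion spent developing the model
price_per_kwh_usd = 0.08          # assumed industrial electricity price (~8 US cents/kWh)
world_consumption_twh = 28_000    # assumed annual world electricity consumption, in TWh

twh_bought = openai_dev_spend_usd / price_per_kwh_usd / 1e9   # kWh -> TWh
share_of_world = twh_bought / world_consumption_twh

print(f"{twh_bought:.0f} TWh, about {share_of_world:.2%} of world consumption")
# -> roughly 88 TWh, around 0.3 per cent, and that spend was spread over several years
```

And in practice, of course, only a fraction of that money went on electricity in the first place.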

Of course, AI is growing rapidly. A tenfold increase in expenditure by 2030 isn’t out of the question. But that would only double the total use of electricity in IT.

And, as in the past, this growth will be offset by continued increases in efficiency. Most of the increase could be offset if the world put an end to the incredible waste of electricity on cryptocurrency mining (currently 0.5 to 1 per cent of total world electricity consumption, and not normally counted in estimates of IT use).

If predictions of massive electricity use by the IT sector have been so consistently wrong for decades, why do they keep being made, and believed?

The simplest explanation, epitomised by the Forbes article from 1999, is that coal and gas producers want to claim that there is a continuing demand for their products, one that can’t be met by solar PV and wind. That explanation is certainly relevant today, as gas producers in particular seize on projections of growing demand to justify new plants.

At the other end of the policy spectrum, advocates of “degrowth” don’t want to concede that the explosive growth of the information economy is sustainable, unlike the industrial economy of the 20th century. The suggestion that electricity demand from AI will overwhelm attempts to decarbonise electricity supply supports the conclusion that we need to stop and reverse growth in all sectors of the economy.

Next there is the general free-floating concern about everything to do with computers, which are both vitally necessary and mysterious to most of us. The rise of AI has heightened those concerns. But whereas no one can tell whether an AI apocalypse is on the way, or what it would entail, an electricity crisis is a much more comprehensible danger.

And finally, people just love a good story. The Y2K panic, supposedly based on the use of two-digit years in computer dates, was obviously false (if it had been true, we would have seen widespread failures well before 1 January 2000).

But the appeal of the story was irresistible, at least in the English-speaking world, and billions of dollars were spent on problems that could have been dealt with using a “fix on failure” approach.

For what it’s worth, it seems likely that the AI boom is already reaching a plateau, and highly likely that such a plateau will be reached sooner or later. But when and if this happens, it won’t be because we have run out of electricity to feed the machines.

Update

The AI boom is also being used to justify talk, yet again, of a nuclear renaissance. All the big tech firms have made announcements of one kind or another about seeking nuclear power to run their data centres. And it’s true that the “always on” character of nuclear makes it a genuine example of the (otherwise mostly spurious) notion of “baseload demand”. But when you look at what Google, Meta and the others are actually doing, it amounts to around 1 GW apiece, the output of a single standard-sized reactor. That might bring a few retired reactors, like the one at Three Mile Island, back online, but it’s unlikely to induce big new investments.
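For scale, here is a rough sketch (the capacity factor and world consumption total below are assumptions of mine, not figures from the announcements): a single gigawatt of near-continuously running nuclear capacity delivers on the order of 8 terawatt-hours a year.

```python
# Rough scale check for the ~1 GW-per-firm figure. The capacity factor and
# world consumption total are assumptions, not taken from the announcements.

capacity_gw = 1.0           # roughly what each big tech firm is reported to be seeking
capacity_factor = 0.9       # assumed: nuclear plants typically run near-continuously
hours_per_year = 8760

annual_twh = capacity_gw * capacity_factor * hours_per_year / 1000   # GWh -> TWh
world_consumption_twh = 28_000    # assumed annual world electricity consumption

print(f"{annual_twh:.1f} TWh a year, {annual_twh / world_consumption_twh:.2%} of world use")
# -> about 7.9 TWh a year, well under 0.1 per cent of world electricity consumption
```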
