Will AI Reverse the Clean Energy Transition? It's Complicated.
Cleantech Insights | cleantechwriting.com | June 6, 2025
Hi there,
Hope you're having a great week. Today I’m looking at data centers and the growth of electricity demand.
I specialize in translating technical complexity into strategic communications for energy innovators, coalitions, consultants and others. Need help articulating complex energy and tech topics? Drop me a line.
While the fate of the reconciliation bill is probably the biggest cleantech story of the year, the rapid growth of electricity use — driven primarily by AI and data centers — may end up being the most important energy story of the decade.
Boiling it down: if forecasts are correct, the AI revolution is going to drive staggering increases in electricity consumption over the next two decades. To meet this need, utilities will have to add huge amounts of new capacity, including from fossil fuels. Climate advocates are worried that this trend threatens to halt or reverse the progress of the clean energy transition that has been underway over the last few years.
But that’s not the whole story. A growing number of analysts are questioning whether AI-driven load growth will materialize at the scale some are claiming. At the very least, these analysts say, the projections are highly variable and uncertain. And getting the numbers wrong is not without consequences.
Projections of Rapid Growth
Over the last few decades, in spite of economic growth, electricity use in the United States has been relatively flat. That has changed in the last couple of years. According to the consulting firm ICF, electrification of buildings and transportation, increased manufacturing, and the growth of crypto have all contributed to recent growth. But it's the potential growth of AI that has really caught people's attention.
Last September, ICF issued a report projecting that electricity demand in the United States would grow by 9% by 2028 and 18% by 2033. Just last month, it updated those numbers, suggesting that demand could grow 25% by 2030 and 78% by 2050. In these scenarios, residential electricity rates could also rise anywhere from 15 to 40% by the end of the decade, and double by 2050.
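One way to put those headline numbers in perspective is to convert cumulative growth into implied compound annual growth rates. A minimal sketch, assuming a 2025 base year (ICF's actual baseline may differ):

```python
# Convert ICF's cumulative demand-growth projections into implied
# compound annual growth rates (CAGR). The 2025 base year is an
# assumption for illustration; ICF may define its baseline differently.

def implied_cagr(cumulative_growth: float, years: int) -> float:
    """Annual growth rate implied by total growth over `years`."""
    return (1 + cumulative_growth) ** (1 / years) - 1

scenarios = {
    "25% by 2030": (0.25, 2030 - 2025),
    "78% by 2050": (0.78, 2050 - 2025),
}

for label, (growth, years) in scenarios.items():
    print(f"{label}: ~{implied_cagr(growth, years):.1%} per year")
```

On this rough math, even the aggressive 2030 scenario implies annual growth of under 5%, which is striking mostly because it follows two decades of roughly flat demand.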
Other studies have reached similar conclusions. In a recent piece, energy journalist Michael Thomas pointed to a Lawrence Berkeley National Laboratory (LBNL) study suggesting that by 2028, data centers could consume between 6.7% and 12% of the entire country’s electricity; around the same time, the consulting group Grid Strategies released a study suggesting that demand could grow by 16% by 2029.
The LBNL report specifically caught Thomas’s attention because one of the authors — energy analyst Jonathan Koomey — is well-known for being skeptical of AI-driven power demand.
How to Meet Demand?
Ironically, in spite of the interest in powering data centers with natural gas, clean energy and demand management are the fastest ways to meet near-term demand growth. As the ICF report says:
Demand-side management programs—such as programs that promote energy-efficient appliances and rooftop solar, as well as other load management strategies like virtual power plants—are attractive because they offset the need for spending on new generation, transmission, and distribution infrastructure at a much lower cost. They are also much quicker to deploy than new utility-scale generation and, therefore, could help manage early demand growth challenges while there is still a high level of uncertainty in shifting demand projections.
However, in the longer term (past 2030), many forecasters are suggesting a move back toward fossil fuels, particularly gas, to meet data center demand.
In other words, after years of talking about the "energy transition," we could be facing a kind of reverse transition to more carbon-intensive sources of energy to meet the demands of crypto and AI.
Will Data Center Growth Actually Materialize? It's Complicated
On the other hand, Michael Thomas, Amory Lovins, and other analysts have pointed to considerable uncertainty around these forecasts as a reason to tread cautiously.
Part of the problem with all of these projections is that no one really knows for sure just how much data center growth will actually materialize.
A recent Canary Media article focused on just a few of these uncertainties:
- Utilities don’t know which data centers will actually get built.
- Forecasts may contain duplicate proposals, as developers approach several utilities at the same time but plan to build only one data center.
- AI could quickly become much more energy-efficient. A few months ago, Chinese firm DeepSeek upended the industry by announcing that it had “replicated the performance of leading U.S.-based AI systems at a fraction of the cost and energy consumption.”
Other skeptics have been harsher. Amory Lovins recently wrote a report in which he accused AI boosters of ignoring myriad uncertainties:
Future electricity needs for artificial intelligence (AI) are wildly uncertain—shaped by unproven concepts, disputed performance, limited trust, volatile markets, unpredictable adoption, and technical efficiency that quadruples roughly each year. Yet a speculative surge is driving massive investment in data centers and new electricity supplies, risking a 12-figure overbuild. Avoiding an electricity bubble requires clear-eyed analysis, disciplined planning, and using markets to allocate risks fairly to potential beneficiaries.
To give just one example, Lovins points out that “forecasters rarely know when AI models will pivot from energy-intensive training mode to far lighter, sporadic, but very protracted inference mode.”
As Michael Thomas points out:
In 2007, energy modelers at the Energy Information Administration (EIA) forecasted that electricity demand would continue to grow much as it did in the decades prior. Their model predicted that the United States would consume 4,700 terawatt hours (TWh) by the year 2023. But they were off by 838 TWh—more electricity than the United Kingdom and France consumed last year. Instead of growing, electricity demand flatlined for much of the next decade and a half.
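The size of that miss is worth making explicit. A quick check using only the figures in the quote (so the "actual" value here is implied from those figures, not independently sourced):

```python
# Back-of-the-envelope check on the 2007 EIA forecast miss described
# in the quote above; both input figures come straight from that passage.

forecast_twh = 4_700  # EIA's 2007 projection for 2023 consumption
miss_twh = 838        # the overshoot cited by Thomas

actual_twh = forecast_twh - miss_twh  # implied actual 2023 demand
error_pct = miss_twh / forecast_twh   # relative forecast error

print(f"Implied actual 2023 demand: ~{actual_twh:,} TWh")
print(f"Forecast overshoot: ~{error_pct:.0%}")
```

An overshoot of roughly 18% on a 16-year forecast is a useful reminder that today’s 2040-and-beyond projections carry wide error bars.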
Finally, there is a more fundamental question of whether AI will have profitable long-term use cases at all.
The Risks of Overbuilding to Meet Speculative Demand
While there are risks in failing to prepare for future demand growth, overbuilding has consequences as well. If utilities overbuild, there’s a good chance residential ratepayers will be left holding the bag.
As Lovins says:
Anyone building a thermal power plant paying back in 20–30 years, to run an AI data center whose energy use per inference may well drop ~75% per year, had better have a solid alternative market; otherwise it’s a bet on sustaining decades of rapid exponential growth in AI services.
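Lovins’s "~75% per year" figure implies a simple break-even condition: total AI energy use grows only if usage grows more than fourfold each year. A minimal sketch (the usage-growth rates below are hypothetical, chosen for illustration, not forecasts):

```python
# Illustrative only: how fast must AI usage grow to outpace a ~75%/yr
# decline in energy per inference? The usage-growth rates here are
# hypothetical, not forecasts.

def total_energy(base: float, usage_growth: float,
                 efficiency_drop: float, years: int) -> float:
    """Relative total energy after `years`, starting from `base`."""
    return base * ((1 + usage_growth) * (1 - efficiency_drop)) ** years

# With energy per inference falling 75%/yr, usage must quadruple
# (+300%) every year just to hold total energy flat.
breakeven = 1 / (1 - 0.75) - 1
print(f"Break-even usage growth: +{breakeven:.0%}/yr")

for usage_growth in (2.0, 4.0):  # +200%/yr and +400%/yr usage growth
    e = total_energy(1.0, usage_growth, 0.75, years=3)
    print(f"usage +{usage_growth:.0%}/yr -> {e:.2f}x energy after 3 years")
```

The asymmetry is the point: even tripling usage every year shrinks total energy under these assumptions, which is why the pivot from training to inference matters so much for demand forecasts.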
Possible Solutions
Fortunately, there appear to be short-term solutions that could fill the gap as the demand picture gets clearer.
- Some data centers are looking at behind-the-meter solutions, in which they generate their own power on-site.
- Analytics companies like Gridcare say they can find as much as 100 GW of slack in the current system.
- The Rocky Mountain Institute (RMI) proposes a co-location "power couples" model, which pairs large consumers with new solar, wind, and battery storage near an existing generator with an approved interconnection.
Ultimately, the smart approach may be to hedge our bets: lean on near-term solutions like demand management and grid optimization while, at least until we get a clearer picture, avoiding risky long-term commitments to fossil fuel infrastructure that could become stranded assets if demand fails to materialize as expected.