Is the disastrous electric demand from AI data centers hyped?

By Kennedy Maize

Early last month, the AP reported that Microsoft Corp. had put a billion-dollar data center project in Ohio on hold, including returning two of the three sites in Licking County, outside Columbus, to agricultural use.

In December, Microsoft pulled the plug on future phases of a $3 billion data center in Wisconsin, Wisconsin Public Radio reported. A company spokesperson said in an email, “We have paused early construction work for this second phase while we evaluate scope and recent changes in technology and consider how this might impact the design of our facilities.”

Ironically, the site of Microsoft’s Wisconsin project is where President Trump, in his first term, claimed he was bringing a $10 billion plant from Foxconn, the Taiwanese manufacturer best known for assembling Apple iPhones. The “deal” failed, as so many of his deals have over the years.

When Microsoft took over the site and began construction, President Biden showed up. He said that Trump “came here with your senator, Ron Johnson, literally holding a golden shovel, promising to build the eighth wonder of the world. You kidding me? Look what happened. They dug a hole with those golden shovels, and then they fell into it.”

Also last month, according to NBC News, Wells Fargo analysts said Amazon is scaling back its commitments to data center leases, including several in Ohio. Amazon and Microsoft are the two largest providers of cloud computing services.

The rush to develop data centers to support advanced computing has threatened to overwhelm the U.S. electric grid. Perhaps the most credible projection came last December from the Department of Energy’s Lawrence Berkeley National Laboratory in its “2024 United States Data Center Energy Usage Report.” The Berkeley Lab report found that “the electricity consumption of U.S. data centers is currently growing at an accelerating rate.” The analysis found “a compound annual growth rate of approximately 7% from 2014 to 2018, increasing to 18% between 2018 and 2023, and then ranging from 13% to 27% between 2023 and 2028.”
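To make those compound rates concrete, here is a quick back-of-the-envelope sketch (my own arithmetic, not a figure from the report) of what the quoted 2023–2028 range implies for total growth over the five-year span:

```python
# Back-of-envelope check (not from the Berkeley Lab report itself): what do the
# quoted compound annual growth rates imply for total growth from 2023 to 2028?
years = 5  # 2023 -> 2028

for cagr in (0.13, 0.27):
    multiplier = (1 + cagr) ** years
    print(f"{cagr:.0%} CAGR over {years} years -> {multiplier:.2f}x 2023 consumption")

# Prints:
# 13% CAGR over 5 years -> 1.84x 2023 consumption
# 27% CAGR over 5 years -> 3.30x 2023 consumption
```

In other words, even the report’s own range runs from roughly a doubling to more than a tripling of data center consumption in five years.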

A more hyperbolic picture came from the Environmental and Energy Study Institute (EESI): “Data centers’ projected electricity demand in 2030 is set to increase to up to 130 GW (or 1,050 TWh), which would represent close to 12% of total U.S. annual demand.”
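A rough unit check (my arithmetic, not EESI’s) shows how the two numbers in that projection relate: 130 GW is a capacity-style figure, while 1,050 TWh is the annual energy it would roughly correspond to.

```python
# Rough unit check (my arithmetic, not EESI's): relating the 130 GW and
# 1,050 TWh figures in the projection above.
hours_per_year = 8_760

# 130 GW of data center load running flat-out all year:
max_twh = 130 * hours_per_year / 1_000          # GWh -> TWh
print(f"130 GW at 100% utilization: {max_twh:,.0f} TWh/year")      # ~1,139 TWh

# Average load implied by 1,050 TWh of annual consumption:
avg_gw = 1_050 * 1_000 / hours_per_year         # TWh -> GWh -> average GW
print(f"1,050 TWh/year implies an average load of about {avg_gw:.0f} GW")
```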

Much hand-wringing has resulted: warnings of rolling brownouts and blackouts, and of new transmission lines sprouting everywhere. Also, EESI noted, “Building new fossil-fuel plants to fulfill this demand will increase carbon emissions and further contribute to climate change.”

There was also much sotto voce glee on the part of electric utilities with cost-based rates. Shades of the nuclear power boom of the early 1970s. That nuclear heavy breathing was built on assumptions of electric demand growing 7% annually into the next century, projections based on short-term growth rates following the end of World War II. Those projections failed to materialize, and one major result was the end of the nuclear boom. Until the recent Vogtle fiasco, no U.S. nuclear plant ordered after 1974 got built.

[Chart: Historic U.S. electric demand growth. Source: Statista.com]

Could the dramatic increase in electric demand turn out to be less than meets the eye? There is no definitive answer, but there are indications that data center demand is manageable and not an impending crisis for the grid and society.

First, there may be a problem of “phantom data centers”: developers queuing up for grid connections for projects that will never get built. Latitude Media reported in March, “As companies race to secure their place in the booming data center market, they’re flooding power providers with speculative, or ‘phantom,’ load requests — many of which will never materialize.”

Next, the hand-wringing over AI demand may be greatly exaggerated. Futurist Michael Barnard, writing on Medium, headlined a recent article “Crying Wolf: The AI Energy Hysteria,” with the subhead “From Dot-Com to Blockchain to ChatGPT, We’ve Heard This All Before.” His key argument is that “optimization” will come into play, making the software much less energy intensive.

He writes, “When I saw rising concerns about energy demand from large language models and generative AI…I knew that optimization would be occurring shortly and that actual energy demand requirements would end up being much lower than the hyperbolic concerns.” An example is China’s DeepSeek, an AI model that is much more energy efficient than the U.S. offerings and also performs better.

Finally, why connect data centers to the grid at all? That’s the case that Houston-based Advocates for Consumer Related Electricity is making. They ask, “What if we simply allow large, sophisticated buyers like data center companies to enter unregulated electricity utility arrangements with the suppliers of their choice? Why do we need a regulator to protect large businesses from their decisions, provided their decisions don’t impact the existing regulated grid?”
