Is “obligation to serve” an obstacle to fair electric markets?

For decades, a governing business paradigm for electric utility companies has been the “duty to serve,” or “obligation to serve.” This, distilled, is a moral and legal requirement to make no distinction among customers seeking new electric services. Is it outdated?

A 2008 paper by a Michigan Public Service Commission staffer described the dictum: “A public utility is not free to choose to serve only those customers which it is convenient or currently profitable to serve.” It derives from English common law and from early 20th century U.S. “common carrier” laws related to railroads, pipelines, and telephone service.

The projected increases in electricity demand driven by power-hungry data centers are raising questions about whether the obligation to serve still makes sense. A recent Power Engineering article noted the possibility of massive new power demands: “Georgia Power projects that over the next decade the state will be leading the nation’s second industrial revolution, led by artificial intelligence boosting data centers, which could triple the state’s energy consumption.”

Severin Borenstein

A recent blog post by energy economist Severin Borenstein, head of the University of California, Berkeley’s Energy Institute at Haas, says, “If we don’t rethink the paradigm for new large electricity customers, they could end up burdening existing ratepayers.”

Borenstein notes, “Around the country, developers of large data centers – known as hyperscalers – are approaching electric utilities with plans to build facilities that can draw 1 GW of power or more and connect them to the grid.” What obligations do utilities have to serve these new, big loads, particularly with a grid that is already struggling to serve existing customers?

The duty to serve, he says, “may at first seem like a fair burden in exchange for being a monopolist that supplies an essential service while being assured of a regulated rate of return on their investment. But the doctrine has always been problematic economically, because ‘non-discriminatory’ has been used to argue that customers who actually impose different costs on the system – whether due to different locations, demand profiles, or predictability of future needs – should be charged similar rates.”

The potential for hyperscalers reveals the problem, says Borenstein, “because a single facility can dwarf all other demand growth that a utility faces, and cancellation of such a facility can completely change a utility’s demand outlook.”

The possible energy implications of the data needs of artificial intelligence programs, and of the previous next big thing, cryptocurrency mining, are on the minds of developers and electricity providers alike. Few answers are clear. Companies such as Google, Amazon, Microsoft, and Meta are looking to line up their own dedicated electric power supplies while supporting goals of not producing additional greenhouse gases. Most of the emphasis so far has been on nuclear power.

Often, these deals focus on new, speculative, dedicated nuclear plants, such as Google’s agreement with Kairos Power for the Hermes reactors, which so far exist only on paper. Meta, Facebook’s daddy, announced this week that it is seeking proposals from nuclear power developers to help meet its artificial intelligence and environmental goals. Meta wants to add 1 to 4 gigawatts of new U.S. nuclear generation capacity starting in the early 2030s, it said in a release. A typical U.S. nuclear plant has a capacity of about 1 gigawatt.

Among the other nuclear options power buyers are pursuing are restarting shuttered nuclear plants, such as Palisades, Three Mile Island, and Duane Arnold, and co-locating loads at currently operating plants, as in the failed deal between Amazon and Talen Energy for the Susquehanna plant in the PJM Interconnection.

The Federal Energy Regulatory Commission’s rejection of the Amazon-Talen deal highlighted some of the issues Berkeley’s Borenstein raises and previewed concerns likely to arise in many of these developments. By locating its data center on the power plant’s site, Amazon would have bypassed the PJM grid.

Two major utilities with a large PJM footprint, Exelon and AEP, fought the Amazon plan. They argued “that the PJM capacity markets will suffer as capacity resources exit to serve load that uses and benefits from, but does not pay for, the system and that replacement capacity will take years to develop.”

While PJM supported the deal, the PJM Market Monitor raised objections. The Market Monitor said the deal “would provide unique and special treatment for a specific type of load and a specific type of power plant and would set a precedent for significant changes to the PJM markets that will impose costs on other market participants.” The Market Monitor added that “the core benefit to the Co-Located Load is avoiding state and federal regulation and the associated costs, such as paying distribution charges and transmission charges.”

Writing in Utility Dive, Mike Granowski of the Roland Berger management consultancy observed, “To the utility system, data centers appear as large industrial loads. However, when the data center steps in front of an existing central generating station for exclusive service of its energy demand, problems can ensue.”

He added, “When the central generation serving the data center is out of service, the data center still needs energy and will seek it from the broader utility system as a backup. This concentrated demand from data centers introduces challenges to grid stability and repurposes the transmission assets that were formerly meant to serve customers. These assets are still being paid for by the wide spectrum of utility customers, raising concerns about potential inequities where certain captive customer groups could shoulder asset costs that are now primarily benefiting the high-load data center.”

–Kennedy Maize

The Quad Report

To comment:

kenmaize@gmail.com