In the Scale of Future AI Power Demands, It's Worth Watching for Zombies

14 Jan 25

A review of announced data center power requirements over the past year suggests that utilities have been asked to supply over 125,000 MW of new capacity to meet data center demand. Much of this demand is AI-related (though some is for typical data center operations, and a small amount may serve future crypto workloads).

But while everyone (myself included) is hyperventilating a bit about the potential impacts on our energy grid, it's important to step back and recognize where we are right now, which is probably in the middle of another classic hype cycle. We've seen these before, the biggest being the dot-com era of the late '90s (raise your hand if you got excited about being able to buy dog food online from a sock puppet. BTW, that puppet now resides in the Henry Ford Museum).

Yes, 25% of Virginia's electric load already serves data centers. And yes, AI will have many uses, some of which we can only imagine today (many of which will make the grid far more efficient – a post for the near future), and it will need more electricity to do so. However, there may be real limits to how much power these AI-driven data centers will ultimately consume, with actual demand numbers coming in well below some of the predictions we hear today. Among other developments, there are likely to be significant gains in chip and cooling efficiency (liquid cooling is making great strides) that will conspire to lower power requirements from what they would otherwise be. Some of this technology is so new that it's hard to know how it will ultimately turn out.

Ask Jeeves

Right now, numerous AI companies are racing to achieve pretty much the same thing: market dominance. There will undoubtedly be companies that do not make it in this highly competitive global race, which will eventually be dominated by a few deep-pocketed participants, with some of today's players going out of business or being absorbed by bigger competitors. This may be analogous to the race for search engine supremacy in the '90s: in the end, we were left with only one or two players – with very deep pockets – that mattered after the rest fell away. Most AI companies are likely to consolidate, and if you don't believe that statement, you can go ask Jeeves.

The interconnection process

However, the purpose of this exercise is to focus specifically on the data center interconnection capacity figures being announced by utilities across the country. There's a good chance these headline numbers greatly inflate what will actually be built, because of how the process of connecting to the utility actually works.

To better understand this issue, let’s look at Dominion Energy’s five-step interconnection process for large loads. Since Dominion has lived in this world for a while (Northern Virginia is the world’s premier data center hub), they are further along in this than most utilities. These steps include:

1) The utility performs a high-level assessment of the customer's electricity needs.

2) Customers who choose to proceed then sign an Engineering Letter of Authorization, committing to pay the utility for detailed studies that analyze optimal locations for substations and the required infrastructure that may need to be developed.

3) Customers who wish to proceed then sign a Construction Authorization Letter that obligates the applicant to pay for all costs associated with the project, regardless of whether the project is ultimately built.

4) Dominion then begins to develop the necessary infrastructure.

5) The entire arrangement is governed by an Electric Service Agreement (ESA) that sets out the terms under which the customer will receive power and the payments it will make in exchange for that service (the ESA is structured to ensure that the customer covers the utility's costs whether or not it takes power at the specified levels).
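The five steps above amount to a one-way ratchet of increasing financial commitment. As a minimal sketch (the stage names and `advance` helper here are illustrative, not Dominion's actual process artifacts):

```python
from enum import Enum, auto

class InterconnectStage(Enum):
    """Illustrative stages of a large-load interconnection request."""
    ASSESSMENT = auto()        # 1) high-level assessment of power needs
    ENGINEERING_LOA = auto()   # 2) customer pays for detailed siting studies
    CONSTRUCTION_LOA = auto()  # 3) customer on the hook for all project costs
    BUILD_OUT = auto()         # 4) utility develops the infrastructure
    ESA_SIGNED = auto()        # 5) Electric Service Agreement governs service

def advance(stage: InterconnectStage) -> InterconnectStage:
    """Move a request to the next stage; each step raises the customer's commitment."""
    order = list(InterconnectStage)  # Enum members iterate in definition order
    i = order.index(stage)
    return stage if i == len(order) - 1 else order[i + 1]
```

The point of the ratchet is that a speculative applicant can walk away cheaply at step 1, but not at step 3 and beyond.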

AEP Ohio, which is attracting heavy load largely because of its super-sized 765-kilovolt transmission lines, is also working hard to get ahead of the game. Last May, the utility was serving 600 MW of data center load and at that time had an additional 4,400 MW of demand covered by ESAs or Letters of Agreement. Since then, an additional 30,000 MW of potential new load has knocked on AEP's door (a figure that is likely even greater today).

In the case of AEP Ohio, the utility has stated that it simply cannot serve that much additional load without major new investments in transmission equipment. As a result, it has proposed a new rate structure that would require data centers larger than 25 MW to pay for a minimum of 85% of their contracted electricity requirements for up to 12 years, even if they end up using less energy than predicted. Data center companies must also demonstrate financial viability and pay exit fees if projects are canceled or scaled back. The utility's goal here is to minimize the potential for orphaned infrastructure that other customers would have to pay for later (in industry jargon, "stranded assets").
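The arithmetic behind that 85% minimum is simple. A rough sketch, using the 25 MW threshold and 85% figure from AEP's proposal but otherwise hypothetical numbers and a deliberately simplified billing rule:

```python
def minimum_billed_mw(contracted_mw: float, actual_mw: float,
                      min_fraction: float = 0.85,
                      threshold_mw: float = 25.0) -> float:
    """Bill a large load for at least min_fraction of its contracted capacity.

    Simplified model of the proposed rate structure: loads at or below the
    threshold pay for what they actually use; larger loads pay for at least
    min_fraction of what they signed up for.
    """
    if contracted_mw <= threshold_mw:
        return actual_mw
    return max(actual_mw, min_fraction * contracted_mw)

# A 100 MW contract that only ever draws 40 MW is still billed as 85 MW.
print(minimum_billed_mw(100.0, 40.0))  # 85.0
```

That gap between 40 MW used and 85 MW billed is exactly what discourages speculative, oversized requests.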

Watch out for Zombies

The reality is that only a fraction of the initial interconnection requests will be built. Although it is impossible to know exactly what is happening at any given moment (in this highly competitive global arena, these companies play their cards very close to the chest), many of the data center companies are likely doing what you or I would do if we had a mandate to secure as much electricity as quickly as possible: we would submit multiple applications to multiple utilities, hoping that at least some of them would pay off. Then, once certain projects looked likely to bear fruit, we'd take the other chips off the table and ditch the "zombie requests" (a useful and colorful phrase from Brian Janous – former VP of Energy at Microsoft and co-founder of Cloverleaf Infrastructure).

Looking to supply-side interconnection queues for an analogy

There is an analog on the supply side of the utility industry that may be instructive here. Today in the US, more than ten thousand generating projects sit in transmission interconnection queues, waiting to be connected to the electric grid before they can be commissioned and deliver power. In fact, the capacity waiting in these queues is approximately twice that of the existing generating fleet. Yet recent history shows that less than 20% of the supply-side projects in the queue are actually built.
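Putting those two figures together gives a sense of the attrition. A back-of-the-envelope sketch, where the "twice the fleet" and "under 20% built" figures come from the paragraph above and the fleet size is an illustrative round number:

```python
# Illustrative round number for existing US generating capacity (~1.2 TW).
existing_fleet_mw = 1_200_000

queue_mw = 2 * existing_fleet_mw   # queued capacity is roughly twice the fleet
completion_rate = 0.20             # under 20% of queued projects get built

built_mw = queue_mw * completion_rate
print(f"Of {queue_mw:,} MW in queue, perhaps {built_mw:,.0f} MW gets built.")
```

In other words, a queue that looks like it would triple the grid historically delivers well under half the fleet's worth of new capacity, and the same haircut may well apply to demand-side requests.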

As utilities further tighten their demand-side interconnection requirements, implementing more formal procedures and imposing fees that require significant upfront financial commitments from the data center crowd, we should expect to see a decrease in zombie applications.

It’s clear that AI has real value for society, and we’re starting to see some use cases emerge. It is also clear that we are still in the early days of this dynamic, with rapidly evolving technologies and business models and many unanswered questions. Getting through the current hype cycle will take some time, and we won’t know the full implications until we start seeing some projects proceed while others are cancelled. If you don’t believe this statement, you can go ask Perplexity.AI. In answer to the question, “How many of the AI data center interconnection requests will actually be built?” it replied, “several factors suggest that only a fraction of the proposed projects are likely to be completed.” Straight from the horse’s mouth…
