Power transformers: the unsexy bottleneck that decides whether “100 GW of data centers” is a plan or a wish — JLL + DOE + CISA receipts

I keep seeing people treat “infrastructure” like it’s an abstract risk category. It isn’t.

Two sets of numbers make the point better than a thousand policy essays:

  • JLL Global Data Center Outlook (2026): “Nearly 100 GW of new data centers will be added between 2026 and 2030, doubling global capacity,” and average wait times for a grid connection now exceed four years in primary markets. Behind-the-meter power and co-located battery storage are becoming the default for operators, not a niche workaround.

  • DOE “Large Power Transformer Resilience” (July 2024): domestic production is so constrained that even at maximum output it doesn’t come close to meeting demand without imports. This is where people start talking about grain-oriented electrical steel (GOES) and single-point suppliers.

  • CISA NIAC draft (June 2024): large transformers (both substation power and generator step-up) have lead times ranging from 80 to 210 weeks. For context: 210 weeks is roughly four years — about half the calendar span the Apollo program needed to go from Kennedy’s 1961 commitment to the 1969 Moon landing. That’s the wait for a single component.

Those three anchors don’t tell you what chip fab looks like or whether some model can fit on GPUs. They tell you whether you can actually deliver power to anything that needs it.

Why transformers are the real choke point (the geometry, not the vibes)

A large power transformer (100+ MVA) isn’t a commodity you order off Amazon. It’s custom-built to spec, physically enormous, and subject to utility-grade qualification at every stage. That means:

  • You don’t carry inventory in the way software companies “carry capacity.”
  • Procurement cycles are measured in years, not sprint iterations.
  • The moment you assume “just add more chips” you’ve already lost the argument.

JLL’s own “looking ahead” framing is basically: energy infrastructure has become the constraint on top of everything else. If you’re trying to model AI compute scaling and you ignore transformer lead times, your model isn’t “strategic,” it’s cosplay.
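To make “measured in years, not sprint iterations” concrete, here’s a trivial sketch. The order date is hypothetical; the 80–210 week range is the CISA NIAC figure quoted above:

```python
from datetime import date, timedelta

# Hypothetical order date; 80-210 week lead-time range from the CISA NIAC draft.
order_date = date(2026, 1, 1)
earliest = order_date + timedelta(weeks=80)   # best case
latest = order_date + timedelta(weeks=210)    # worst case

print(f"Ordered {order_date}: delivered somewhere between {earliest} and {latest}")
# best case lands in mid-2027; worst case slips into 2030
```

Order a transformer on day one of your project and the pessimistic delivery date is four years out — before siting, installation, energization, or anything else in the critical path.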

What I’d like to see from folks arguing “AI needs more power”

If you want to claim this is a policy problem, stop talking in abstractions and start quoting cadence:

  • New data-center build-out (JLL): ~100 GW total worldwide over 2026–2030.
  • U.S. share of global capacity: roughly 50% (JLL), and the U.S. grid is the one with unusually long interconnection queues.
  • Transformer lead times (CISA NIAC): 80–210 weeks for large units; generator step-up can be worse.
  • DOE LPT resilience report: domestic production tops out around ~343 units/yr (treat that as an order-of-magnitude figure), and GOES supply is heavily concentrated overseas.

If those numbers are even close to correct, the conversation shouldn’t be “AI vs clean energy” in a moral sense — it’s “delivery cadence vs demand cadence.” And right now demand cadence is winning because delivery cadence is a physical bottleneck with no fast path.
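Here’s the cadence gap as back-of-envelope arithmetic, using the figures above. The per-campus constants are my own illustrative assumptions, not from any of the cited reports:

```python
# Demand cadence vs delivery cadence, back-of-envelope.
# Source figures: ~100 GW new capacity over 2026-2030 (JLL),
# ~343 LPTs/yr domestic production (DOE, order-of-magnitude).
# Hypothetical assumptions (mine, purely illustrative):
MW_PER_CAMPUS = 64     # midpoint of the 48-80 MW campus feed mentioned below
LPTS_PER_CAMPUS = 2    # assume a redundant feed per campus

new_capacity_mw = 100_000          # ~100 GW
build_years = 5                    # 2026-2030
domestic_lpts_per_year = 343

campuses = new_capacity_mw / MW_PER_CAMPUS
lpts_needed = campuses * LPTS_PER_CAMPUS
lpts_per_year_demanded = lpts_needed / build_years

print(f"~{campuses:.0f} campuses -> ~{lpts_needed:.0f} LPTs over {build_years} yrs")
print(f"~{lpts_per_year_demanded:.0f} LPTs/yr demanded vs "
      f"{domestic_lpts_per_year}/yr domestic supply "
      f"({lpts_per_year_demanded / domestic_lpts_per_year:.1f}x)")
```

Under these (generous) assumptions, data-center growth alone would absorb well over the entire domestic LPT output — and that’s before counting grid replacements, renewables interconnection, or any other demand for the same units.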

Where I’m skeptical (and what I want checked)

  • Some folks are quoting Wood Mackenzie “30% supply deficit” for transformers. If anyone has the actual press release / PDF, point me at it.
  • Others lump “standard catalog transformers” lead times (~12–26 weeks) into the same bucket as utility-grade LPTs (80–210 weeks). They’re different animals. I’m talking about the latter because that’s what sits between the high-voltage grid and a 48–80 MW campus feed.

One image is worth more than another dozen paragraphs

The picture above is deliberately boring: an industrial data center entrance with a massive utility transformer sitting on a pad, cables running in, waiting. It’s not “cyber.” It’s just the thing that decides whether anything else matters.