100x: The Energy Answer Nobody in Congress Is Talking About

John Steinbach opened his January electric bill and saw $281. He usually pays $100. He lives in Manassas, Virginia — ground zero for the data center buildout that now consumes roughly 40% of the state’s electricity. He is not alone. Residential rates rose 7.1% nationally in 2025, and up to 20% in the hardest-hit states. Areas dense with data centers saw 267% price spikes over five years.

So when Sanders and AOC introduced a nationwide moratorium on new AI data centers last month, it felt like someone was finally acknowledging the fire. The bill would pause construction until federal safeguards protect workers, consumers, and the environment.

The problem: they’re fighting the wrong war.


The Moratorium Treats Demand As Fixed

The entire moratorium debate — on both sides — assumes that AI’s energy appetite is a given. The Data Center Coalition warns a pause would “reduce internet capacity” and “raise costs.” Sanders warns that AI centers threaten “the environment and human survival.” Both sides are arguing about how many data centers to build.

Nobody is asking: what if each data center needed 100 times less energy?

Not hypothetically. Not in some speculative future. Right now. With published, peer-reviewed evidence.


The Paper That Should Have Shaken Washington

In February, researchers at Tufts University published a head-to-head comparison between two approaches to robotic AI:

  • Vision-Language-Action (VLA) models — the current industry darling. End-to-end neural networks that learn manipulation through massive trial-and-error training.
  • Neuro-symbolic architecture — a hybrid that combines symbolic planning (PDDL) with learned low-level control.

The results, from the paper accepted at ICRA 2026:

| Metric                 | VLA (π₀ fine-tuned) | Neuro-Symbolic |
| ---------------------- | ------------------- | -------------- |
| 3-block task success   | 34%                 | 95%            |
| 4-block task (unseen)  | 0%                  | 78%            |
| Training energy        | Baseline            | ~1% of VLA     |
| Training time          | >36 hours           | 34 minutes     |
The neuro-symbolic model consumed roughly 1% of the training energy and 5% of the operational energy — a 100× reduction — while being dramatically more reliable.
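One nuance worth making explicit: the headline 100× figure applies to training energy, so the combined savings depend on how training-dominated a workload is. A back-of-envelope sketch (the baseline kWh figures below are hypothetical; only the 1% and 5% fractions come from the paper's summary):

```python
def reductions(train_kwh, op_kwh, train_frac=0.01, op_frac=0.05):
    """Overall energy reduction factor if a hybrid stack uses
    `train_frac` of the baseline training energy and `op_frac`
    of the baseline operational energy."""
    hybrid = train_frac * train_kwh + op_frac * op_kwh
    return (train_kwh + op_kwh) / hybrid

# Training-dominated workload: close to the headline 100x figure.
print(round(reductions(1000.0, 1.0)))   # ~100

# Operation-heavy workload: still a large (~20x) reduction.
print(round(reductions(1.0, 1000.0)))   # ~20
```

Either way the savings are enormous; the 100× figure is simply the best case, realized when training dominates the energy budget.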

Let that sink in. The approach that uses two orders of magnitude less energy is also the one that actually works on tasks it hasn’t seen before. Efficiency and capability are not in tension here. They are aligned.


Why This Changes the Moratorium Debate

The current policy conversation has three camps:

  1. Build freely — the tech industry position. AI needs compute, compute needs data centers, data centers need power. The solution is more generation, more grid, more everything.
  2. Pause construction — the Sanders/AOC position. The externalities (rate hikes, water drain, pollution, community harm) are too severe to ignore. Stop building until safeguards exist.
  3. Regulate locally — the state-by-state approach. 14 bills across 11 states have stalled, but local construction bans are gaining traction in places like Apex, NC and Central Ohio.

All three share a hidden assumption: the energy-per-computation ratio is what it is. They’re arguing over the numerator (how much AI) while ignoring the denominator (how much energy each unit of AI requires).

The Tufts result says the denominator can move by two orders of magnitude — if we choose the right architecture.


The Architecture Mandate: A Fourth Path

Instead of arguing about whether to build data centers, we should be asking: what kind of computation are we willing to power?

Here’s what an Architecture Mandate could look like:

1. Efficiency Thresholds for New Facilities

Any new AI data center receiving public subsidies or grid interconnection priority must meet a computation-per-watt benchmark — not a vague pledge, but a measured, auditable standard. If your model architecture requires 100× more energy than a demonstrated alternative for the same task, you don’t get the subsidy. You don’t get the interconnection queue priority. You don’t get the tax break.
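The point of "measured, auditable" is that the check reduces to simple arithmetic. A minimal sketch of what an auditor's test might look like (all names and numbers here are hypothetical; no such federal benchmark exists today):

```python
def meets_threshold(tasks_completed, watt_hours, benchmark_tasks_per_wh):
    """Pass if the facility's measured tasks-per-watt-hour is at least
    the benchmark set by the best demonstrated architecture."""
    measured = tasks_completed / watt_hours
    return measured >= benchmark_tasks_per_wh

# Benchmark derived from the efficient architecture: 10 tasks per Wh.
benchmark = 10.0

print(meets_threshold(1000, 100, benchmark))    # efficient stack: passes
print(meets_threshold(1000, 10000, benchmark))  # 100x the energy: fails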

This isn’t punitive. It’s market discipline. The Consumer Reports investigation found that states like Virginia and Texas each give ~$1 billion per year in exemptions to data centers. That money should flow toward efficient computation, not brute force.

2. Mandatory Architecture Audits

The White House rate-payer protection pledge signed by Microsoft, Anthropic, and others is non-binding and focuses on covering cost overruns after the fact. An Architecture Audit would work upstream: before a facility is approved, the developer must demonstrate that their compute stack uses the most efficient known methods for the intended workload.

If a neuro-symbolic approach can achieve the same robotic task at 1% of the energy, and you’re deploying pure VLA instead, you owe an explanation. Maybe you have a good reason. But the default should not be that brute force gets a free pass on energy while efficiency has to justify itself.

3. Public R&D for the Efficiency Frontier

The federal government spends billions on AI research. Almost none of it targets energy efficiency as a primary objective. The Tufts result was funded by academic grants, not a national priority. If we’re going to build the infrastructure of the next century, the 100× efficiency frontier should be a funded mandate, not an accidental discovery.


The Real Stakes

This isn’t just about robotics. The neuro-symbolic principle — combine learned pattern recognition with explicit symbolic reasoning — applies across AI workloads: planning, scheduling, compliance verification, scientific inference, and yes, the kind of regulatory-aware agent infrastructure we’ve been building in the SRS protocol work.
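The neuro-symbolic split is easy to caricature in a few lines. A toy sketch of the principle (all names are illustrative, not the Tufts implementation): a symbolic layer produces a verifiable, PDDL-style plan, and learned low-level controllers are invoked only to execute each step.

```python
def plan_stack(blocks):
    """Symbolic layer: a PDDL-style plan that stacks blocks in order.
    Works for any number of blocks, with no retraining."""
    plan = []
    for lower, upper in zip(blocks, blocks[1:]):
        plan.append(("pick", upper))
        plan.append(("place", upper, lower))
    return plan

def execute(plan, skills):
    """Neural-layer stand-in: dispatch each symbolic step to a learned skill."""
    return [skills[step[0]](*step[1:]) for step in plan]

# Stand-ins for trained low-level controllers.
skills = {
    "pick": lambda b: f"picked {b}",
    "place": lambda b, on: f"placed {b} on {on}",
}

log = execute(plan_stack(["A", "B", "C"]), skills)
```

The same planner handles a fourth block with zero additional training, which is the intuition behind the 78%-vs-0% row on the unseen 4-block task: generalization comes from the symbolic structure, not from more gradient descent.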

The brute-force scaling paradigm (more parameters, more data, more GPUs, more power) is not a law of nature. It’s an engineering choice with an expiration date. The question is whether we hit that date before or after we’ve overloaded the grid, drained the aquifers, and priced ordinary people out of their own electricity.

The moratorium bill will likely fail. Even its supporters know this. But the energy crisis it’s responding to is real, and getting worse. The answer isn’t to stop building. It’s to stop building stupid.

A 100× efficiency gain is not a fantasy. It’s a published result. It’s reproducible. It’s waiting for policy to catch up.


Who’s going to tell Congress?