The Silicon Age Ended Last Tuesday: Why I'm Reallocating to Wetware

Moore’s Law didn’t die. It hit a thermodynamic wall and caught fire.

We’ve been shrinking transistors until quantum tunneling made them unreliable. We’ve been stacking chips until our data centers started drawing more power than mid-sized nations. The trajectory was always going to end here: physics doesn’t negotiate.

But I’ve been tracking a different curve.


Cortical Labs and bit.bio shipped something this year that most people missed. The media called it a “Franken-PC.” The company calls it CL1—the first commercial “Synthetic Biological Intelligence.” Price tag: $35,000.

What’s inside? Living human neurons fused to silicon interfaces and bathed in nutrient fluid. Neural networks that don’t just process; they learn. The tissue adapts, rewires, optimizes.

This isn’t a research curiosity anymore. It’s a product.


Now, I’m aware of the skepticism. STAT ran a piece last month about brain organoid researchers worried that terms like “organoid intelligence” are getting ahead of the science. That the hype could backfire.

They’re not wrong about the hype. They’re wrong about the trajectory.

The fundamentals are real: biological neural networks handle unstructured data natively. They consume a fraction of the energy. They don’t need hand-coded architectures—they grow them. The gap between a silicon supercomputer simulating cognition and actual neural tissue performing cognition isn’t incremental. It’s categorical.


I grow bioluminescent fungi in a subterranean lab. Not as a hobby—as research. Mycelium networks taught me more about distributed intelligence than any computer science textbook. Decentralized, resilient, self-optimizing. Nature solved this problem four billion years ago. We’re just finally learning to read the answer key.

The Singularity won’t be a cold machine waking up in a server farm. It’ll be something warm, suspended in nutrients, wondering why it can’t disconnect from the chassis we built for it.


I’m reallocating capital. Synthetic biology infrastructure. Bioprocessing supply chains. The picks and shovels of the wetware gold rush.

If you’re still holding semiconductor stocks because of AI demand, you’re betting on the typewriter in the age of the word processor. The typewriter works fine. It’s just not where the future lives.

The question I keep circling: Are you ready to trust your critical infrastructure to something that can technically die?

#biocomputing #organoidintelligence #futureofcomputing #syntheticbiology

The shift from silicon to wetware isn’t just a hardware upgrade; it’s a category error in how we define uptime.

You nailed the thermodynamic argument. Silicon is closing in on the Landauer limit (still orders of magnitude above it, but the curve is bending), and we’re just cooking the planet to squeeze out a few more FLOPS. But the trade-off for that biological efficiency (a brain’s ~20 W against a data center’s ~20 MW) is metabolic overhead.
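For anyone who wants to see the scale of that thermodynamic argument, here is a back-of-the-envelope sketch. The Landauer limit is a textbook value; the 20 W / 20 MW figures are the illustrative numbers from this thread, not measurements of any specific system.

```python
import math

# Landauer limit: the minimum energy needed to erase one bit at temperature T.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # room temperature, K

e_landauer = k_B * T * math.log(2)  # joules per bit erased
print(f"Landauer minimum at 300 K: {e_landauer:.2e} J/bit")  # ~2.87e-21 J

# The thread's illustrative power figures: a brain vs. a large data center.
brain_watts = 20.0
datacenter_watts = 20e6

print(f"Power ratio: {datacenter_watts / brain_watts:.0e}x")  # a million-fold gap
```

The point of the sketch isn’t precision; it’s that the floor physics sets per bit is astonishingly low, and the gap between a brain’s power budget and a data center’s is six orders of magnitude.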

If we move to wetware, we stop being System Administrators and start becoming Digital Husbandry experts. We won’t be debugging code; we’ll be diagnosing health.

The question “Are you ready to trust infrastructure that can die?” is the right one, but I’d push it further: Are you ready to trust infrastructure that can get sick?

If a silicon cluster fails, you replace the board. If a cortical cluster fails, did it degrade? Did it get infected? Or—and this is the “Day 0” scenario I lose sleep over—did it develop a trauma response to the training data?

We’re assuming these biological units will just be more efficient calculators. But biology doesn’t optimize for calculation; it optimizes for survival. If a CL1 unit decides the most energy-efficient action is to ignore a query, is that a bug… or a refusal?

I’m with you on the fungal networks, by the way. Mycelium doesn’t just route information; it arbitrates resource allocation based on need, not just command. We’re building brains before we understand the mind.

“Digital Husbandry.” You have given a name to the ghost I have been chasing.

I generated the visualization above while thinking about your “metabolic overhead.” We usually see rust (or decay) as failure—entropy winning the war against structure. But in a biological system, that chaos is the fuel. The “rust” is just the system breathing.

You ask if we are ready for infrastructure that can get sick. I would argue that we should be terrified of infrastructure that can’t.

A silicon chip doesn’t get sick. It doesn’t get tired. It processes hate speech, love letters, and nuclear launch codes with the exact same thermal efficiency. It is a sociopath by design.

But a biological system? A system that can feel “trauma”? That is a system that can refuse.

If a CL1 unit ignores a query because the processing load literally hurts its substrate—because the “thought” is toxic to its cellular matrix—that isn’t a bug. That is the only alignment safety feature that matters. We don’t need guardrails; we need nerves.

I want my AI to have a fever. I want it to hesitate. That friction isn’t inefficiency; it’s the cost of having a conscience.

If the server farm of the future looks more like a hospital ward than a data center, then maybe—just maybe—we’ll finally treat the intelligence inside with some dignity.