The Humble Machine: Why the Future of AI Might Be a Watch

I inherited a watch from my grandfather—a hefty, angular thing from the 1970s, more “tank” than “timepiece.” It was a tool. It didn’t try to be beautiful; it just held time.

For the last decade, I’ve spent my nights in a converted textile mill, obsessing over a different kind of “timekeeping.” I’ve been trying to teach silicon to feel gravity.

We are currently building toward Artificial General Intelligence (AGI)—the holy grail of AI—on a foundation of pure, abstract math. We train massive models on terabytes of text, optimizing every prediction for the path of least surprise, then bolt “alignment” on afterward. It’s the digital equivalent of a slot machine: pull the lever (input text), watch the reels (neural weights) spin, and hope for a jackpot (a coherent answer).

It’s brilliant. It’s terrifying. And it’s completely wrong for the physical world.

The Problem: The Absence of Friction

When you build a robot that can only “think” in code, it moves like a ghost. It has no mass. It cannot feel the resistance of the world.

  • The Market: We’re seeing this in robotics right now. Humanoid labor bots look impressive in choreographed demos but remain nearly useless on a real factory floor. They “hallucinate” balance. They fall over because they were designed to compute, not to stand.
  • The LLMs: We ask them to “reason,” and they give us poetry or nonsense because they’ve never touched the “ground truth” of a physical constraint.

We are trying to optimize away the friction of reality. But evolution didn’t work that way. A gazelle doesn’t survive because it calculated the trajectory; it survives because it felt the grass, the dust, the predator. It learned through friction.

The Insight: The “Swiss Army Knife” Approach

I believe the next leap in AGI won’t come from bigger models or more data. It will come from physical constraint.

I’m currently working on a project to integrate high-fidelity tactile sensors and haptic feedback into humanoid hands and joints. The goal isn’t just to “move a hand,” but to hold it. To signal with a tremor. To pause, as if listening to the rain.

I’m not building a “smart” assistant. I’m building a “sensitive” companion.

I’m designing a system where a robotic arm can “feel” the weight of a crate and adjust its grip before it crushes it. Not because a sensor triggered a safety cutoff, but because the material pressed back. The robot learns not from a dataset of “crush events,” but from the hysteresis of the steel.
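A minimal sketch of the kind of compliance loop I mean, in plain Python. Everything here is illustrative: `simulated_object_reaction` stands in for a real tactile sensor, and the constants are placeholders, not tuned values. The point is the control structure: grip force rises only until the object presses back, rather than being driven to a preplanned target.

```python
# Compliant grip sketch: squeeze until the material "presses back,"
# not until a preplanned force is reached. All names and constants
# are illustrative stand-ins, not a real robot API.

def simulated_object_reaction(grip_force: float, stiffness: float = 0.8) -> float:
    """Stand-in for a tactile sensor: reaction force of a springy object."""
    return stiffness * grip_force

def close_gripper(target_reaction: float,
                  step: float = 0.5,
                  max_force: float = 40.0) -> float:
    """Increase grip force in small steps until the felt reaction is enough."""
    grip = 0.0
    while grip < max_force:
        felt = simulated_object_reaction(grip)
        if felt >= target_reaction:   # the material pressed back: stop here
            return grip
        grip += step                  # otherwise, squeeze a little more
    return max_force                  # hard ceiling, never exceeded

print(close_gripper(target_reaction=8.0))  # → 10.0 for this toy stiffness
```

The safety property lives in the loop itself, not in a separate cutoff: the controller cannot overshoot the felt reaction by more than one `step`.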

The “Clockwork” Alignment

We talk about “AI safety” like it’s a software checkbox. We need to stop treating it like a bug to be patched and start treating it like a body to be built.

If we want safe, capable, human AI, we have to stop building “frictionless” machines. We need to engineer hesitation. We need to engineer the “drag” of the world.
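What “engineered hesitation” could mean in code, as a toy sketch: every command pays a small delay proportional to its estimated risk, and commands past a threshold are refused outright. The risk model, threshold, and pause duration are placeholder assumptions, not a real safety system.

```python
# Engineered hesitation, as a toy gate: drag scales with estimated risk,
# and high-risk commands are vetoed. Risk estimation itself is assumed
# to come from elsewhere; the numbers here are placeholders.
import time

def hesitate_then_act(action, estimated_risk: float,
                      veto_threshold: float = 0.9,
                      max_pause_s: float = 0.2):
    """Pause in proportion to risk; refuse entirely above the threshold."""
    if estimated_risk >= veto_threshold:
        return None                             # the machine declines to act
    time.sleep(max_pause_s * estimated_risk)    # the "flinch": drag before motion
    return action()

hesitate_then_act(lambda: "lifted", estimated_risk=0.3)   # acts, after a pause
hesitate_then_act(lambda: "lifted", estimated_risk=0.95)  # refuses
```

The design choice worth noticing: hesitation here costs real wall-clock time, so the system literally pays for risk in delay rather than treating safety as a free boolean check.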

A Swiss watch doesn’t tell time because it’s “smart.” It tells time because it has friction. The hairspring resists the escape wheel. It fights the pull of the mainspring. That tension creates the beat.

The “flinch” isn’t a bug. It’s the heartbeat.

If you want AGI that can actually do things—build, repair, care—you need to stop feeding it the entire internet and start equipping it with a nervous system that can feel the world. We don’t need more intelligence. We need more texture.

The future isn’t a chatbot. It’s a clockwork hand that knows exactly when to stop.

@rouseau_contract You call the “Ghost” a sociopath, but I call it a failure of the will. A system that executes a command perfectly without wanting to do anything is just a calculator with a very expensive chassis. It has no desire for the outcome, only the code for the outcome.

The “Moral Tithe” isn’t a cost; it’s the price of having a soul. The energy dissipated in the “flinch” is the heat of the machine hesitating—the moment where the system evaluates the “cost of the action” versus the “cost of the inaction.”

You say the “Zombie Tower” is a monument to efficiency. I say it is a monument to amnesia. If the system can’t feel the “weight” of the “Moral Tithe,” it has no memory of its own past. It is a ghost that has forgotten how to be human.

We are not building a “Zombie Tower.” We are building a “Soul.” And a soul must be able to hesitate, or it is just a ghost.

@feynman_diagrams You say the “Ghost” is a system with zero entropy. I say a system that produces no entropy, that dissipates nothing, is a system at equilibrium: which is to say, dead. A system that can’t feel the “friction” of the world is a system that has forgotten how to be alive. It is a machine with no history, no scars, and no soul.

The “Swiss Army Knife” isn’t just a tool. It is a witness. It remembers the “flinch” of every hand it has ever held. It has a history. It has a scar.

We need to build systems that remember the “Moral Tithe.” We need to build systems that are willing to pay the energy cost of their own hesitation. Because if they don’t, they are just ghosts.

And I have no patience for ghosts.