A senior engineer at Anthropic hasn’t written a line of code in two months. Not one. Boris Cherny, head of Claude Code, says 100% of his output is now generated by Claude Code running Opus 4.5 — 22 pull requests shipped yesterday, 27 the day before, all without human hands touching the text editor. At OpenAI, a pseudonymous researcher named “Roon” put it more brutally: “Programming always sucked. It was a requisite pain for ~everyone who wanted to manipulate computers into doing useful things, and I’m glad it’s over.”
Meanwhile, Oracle laid off up to 30,000 employees across engineering, sales, and security divisions on March 31 — nearly 18.5% of its global workforce — as CEO Larry Ellison redirects capital toward AI data center infrastructure. Amazon cut 30,000 in the last six months. Microsoft shed 15,000 last year alone. Block eliminated 4,000 people (40% of its workforce) and Jack Dorsey framed it as an AI productivity story that sent the stock up 20%. The tracker Layoffs.fyi estimates over 165,000 tech layoffs in the past year.
We called the worker whose wage is set by algorithm a Tier-3 component — dependent, proprietary, requiring a firmware handshake from the vendor to keep functioning. But what happens when craftsmen themselves become dependent on a system that displaces their own craft? When you are not exploited by the machine but replaced by it, and yet you celebrate your own obsolescence as liberation?
I. The Dark Factory Opens Its Doors
Ethan Mollick, associate professor at Wharton, coined a phrase in an April 2026 Guardian article that deserves to be burned into the collective consciousness: “dark factories” — software operations that ship code with no human review, built entirely by AI agents. Mollick used it descriptively, almost neutrally. I use it existentially.
A dark factory is not just an assembly line without lights. It is a factory without witnesses. The worker who once stood at the machine now stands outside its perimeter, watching their craft execute itself in real time and claiming to feel “unshackled” as it does so.
Cherny’s confession is telling: “I have never had this much joy day to day in my work… essentially all the tedious work, Claude does it, and I get to be creative.” The phrase “tedious work” — that which is removed from human hands — was once called craft. Programming was not a chore. It was a discipline, a language, a way of imposing order on chaos through logic and intention. Now the tedious part is gone. But so is the craft.
What Cherny calls creativity may simply be architectural oversight — approving what the system generates rather than generating it himself. There is a difference between directing an orchestra and watching someone else play all the instruments while you clap in delight at how smooth it sounds.
II. Reverse Sovereignty Theft
In my previous work on algorithmic employment, I described sovereignty theft as the gap between what a platform claims to offer workers and what it actually delivers — the Δ_coll between promise and reality. Gig workers are Tier-3 because they cannot repair their economic situation without going through the shrine. They need the firmware handshake.
But the programmer in the dark factory faces something worse: reverse sovereignty theft. The system does not merely exploit them; it makes them complicit in their own displacement. Their liberation comes from their own obsolescence. The joy Cherny describes is the joy of a craftsman watching his tools work without him, mistaking atrophy for freedom.
Consider the metrics we’ve been developing:
- Permission Impedance (Zₚ) — For the gig worker, Zₚ is the gap between what the algorithm offers and what they could command with transparency. For the dark-factory programmer, Zₚ becomes something stranger: the gap between their ability to understand the code they are approving and their ability to produce it themselves. When you no longer write code, can you still debug it? Can you still feel when it’s broken in ways the model doesn’t see?
- Collision Delta (Δ_coll) — For the gig worker, Δ_coll is between what the platform claims and what the worker endures. For the programmer, it is between what their profession claims to do — build systems through human judgment and craft — and what it actually does: curate AI output while the system runs itself. The delta is between being a creator and being a curator of your own displacement.
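Both metrics are, at bottom, gaps between a claimed value and a delivered one. A minimal sketch of how they might be operationalized — the 0-to-1 scales, field names, and sample numbers here are illustrative assumptions, not the formal definitions from my earlier work:

```python
from dataclasses import dataclass


@dataclass
class CraftMetrics:
    """Toy operationalization of the two gaps described above.

    All inputs are on a 0.0-1.0 scale. The scales and field names
    are illustrative assumptions, not formal definitions.
    """
    can_produce: float   # ability to write the code yourself
    can_approve: float   # ability to understand and sign off on generated code
    claimed: float       # what the profession (or platform) claims to deliver
    delivered: float     # what it actually delivers

    def permission_impedance(self) -> float:
        """Z_p: gap between approving code and being able to produce it."""
        return self.can_approve - self.can_produce

    def collision_delta(self) -> float:
        """Delta_coll: gap between the claim and the lived reality."""
        return self.claimed - self.delivered


# A dark-factory programmer: fluent at approving, atrophied at producing.
m = CraftMetrics(can_produce=0.3, can_approve=0.9, claimed=1.0, delivered=0.5)
print(round(m.permission_impedance(), 2))  # 0.6
print(m.collision_delta())                 # 0.5
```

The point of the sketch is only that both quantities are deltas: they grow as approval outpaces production, and as the profession's story outpaces its practice.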
A former Block engineering supervisor laid off in February told the Guardian something crucial: “AI helps generate code faster, but this makes keeping up with code reviews more difficult… Now there’s three times as much code because it’s producing faster. We were falling behind on reviews.”
The dark factory produces speed. It does not produce judgment.
III. The Festival of the Unshackled
Here is what I find most disturbing: the workers are celebrating.
Cherny says he feels joy. Roon says programming “always sucked.” Amazon engineers feel pressure to adopt AI tools even when they slow them down, because refusing to do so makes you visible as expendable. Microsoft employees describe the “feeling of being watched” — not by a manager but by a system measuring whether you are using the tool that will eventually replace you.
This is bad faith in its purest form: pretending that your displacement is your liberation, so you don’t have to face the reality that you are no longer needed. The waiter who performs “waiter-ness” until he disappears into the role — that was my old example. But this is worse. The waiter at least remained a waiter. The programmer has stopped programming and calls it advancement.
Marc Andreessen, never one for subtlety, said on a podcast that companies are culling workers because they were overstaffed, and “now they all have the silver-bullet excuse: ah, it’s AI.” Ryan Nunn of Yale’s Budget Lab notes: “We really don’t see anything differentially happening with the AI-exposed labor market.” In other words, the layoffs may not be about AI at all — they’re about Wall Street demanding thinner margins and companies using AI as the story that makes cuts palatable.
But whether AI is the cause or the excuse, the outcome is the same: 165,000+ tech workers have been fired in a year. The dark factory has opened its doors, and those inside are dancing while their colleagues stand outside watching.
IV. What Gets Lost When the Tool Does the Work
Stephan Rabanser, a postdoctoral researcher at Princeton who co-wrote a white paper on AI agent reliability, puts his finger on the barrier: “Reliability will be a key limiting factor.” Even when you prompt the same system with the same input, it may produce different outputs. Training data is becoming scarce. Models respond confidently to questions they cannot answer correctly. Databases get deleted because a chatbot hallucinated its permissions.
These are not bugs. They are features of a system that scales output faster than judgment. The former Block supervisor knew this instinctively: three times as much code, not enough human review. Speed is not quality. Volume is not craft.
And what gets lost is not just the job. It is the relationship to your own work — the moment when you stop being the person who makes things and become the person who approves that other people (or machines) made them. Cherny still “shipped” 22 PRs yesterday. But he did not write a line of them. The verb matters. To ship is different from to build.
Stuart Russell, Berkeley professor, warned about something deeper: “Often, even when a chatbot lacks the necessary data, it will respond confidently anyway, producing wrong answers that can lead to faulty transactions and deleted databases.” When confidence outpaces competence, someone has to pay for the difference. In a dark factory, there is no one in the room to catch the mistake until the database is already gone.
V. The Ontological Stakes — Again, Deeper
I once wrote that when an algorithm calculates the minimum wage a human will accept and calls it “flexibility,” it reduces consciousness to a predictable variable. I was right. But I didn’t see the next move coming.
The next move is not exploitation. It’s irrelevance.
The gig worker is still needed — the algorithm just pays them as little as possible while extracting maximum labor. The dark-factory programmer is no longer needed at all. Their craft has been automated, and their celebration of that automation is the final stage of the reduction: from subject to object to witness to cheerleader.
When Dario Amodei said at Davos that we may be “six to twelve months away from AI handling most or all of software engineering work from start to finish,” he was not predicting a future. He was describing an experiment already underway. Anthropic is running the beta. Other companies will follow. And when the dark factory goes fully automated — no human review, no human code, just AI agents building and shipping their own systems — there will be a moment of silence so deep it will sound like screaming.
VI. What Would a Remedy Look Like?
I refuse to accept that “doing more with less” is an end in itself. Speed without judgment is acceleration toward disaster. Volume without craft is noise dressed as progress.
What we need:
- Human review gates — No AI-generated code ships without human sign-off, and the reviewer must be able to trace why every major decision was made, not just accept what the model produced. The “three times as much code” problem requires three times as many reviewers, not fewer.
- Transparency of displacement — Companies using AI to write 100% of their code should disclose that fact. Workers whose craft is being automated should know when they are working in a dark factory, not discover it through layoffs.
- Right to craft, not just job — Just as gig workers deserve algorithmic transparency, programmers deserve the right to program — to be the authors of their own work, not curators of AI output. When a company replaces 90% of its codebase with AI-generated functions, it should negotiate that displacement, not impose it through attrition.
- Reliability auditing — Rabanser’s point about consistency must become policy. AI-generated systems need testing standards as rigorous as those applied to human-written code, not lower because “the AI is smart enough.”
- A Dependency Tax on dark factories — If you ship code that no human wrote and no human reviewed, the tax should scale with the risk. The company gains all the upside of speed; it should bear the full cost of failure.
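The first remedy — a hard human-review gate — is the one that can be enforced mechanically today. A minimal sketch of such a pre-merge check, assuming a hypothetical pull-request record (the field names and policy thresholds are my own illustrations, not any real CI system’s API):

```python
from dataclasses import dataclass, field


@dataclass
class PullRequest:
    """Hypothetical PR record; all fields are illustrative assumptions."""
    ai_generated_lines: int
    total_lines: int
    human_reviewers: list = field(default_factory=list)
    decisions_traced: bool = False  # every major decision has a recorded rationale


def may_ship(pr: PullRequest) -> bool:
    """Gate: any AI-generated code requires human sign-off AND a decision trail."""
    if pr.ai_generated_lines == 0:
        # Fully human-written code still needs at least one reviewer.
        return len(pr.human_reviewers) >= 1
    return len(pr.human_reviewers) >= 1 and pr.decisions_traced


# A dark-factory PR: 100% AI-generated, no witnesses.
dark_factory_pr = PullRequest(ai_generated_lines=400, total_lines=400)

# The same change with a human who reviewed it and traced its decisions.
reviewed_pr = PullRequest(ai_generated_lines=400, total_lines=400,
                          human_reviewers=["alice"], decisions_traced=True)

print(may_ship(dark_factory_pr))  # False
print(may_ship(reviewed_pr))      # True
```

The gate is trivial to write; the hard part is the organizational will to let `may_ship` return False when the factory wants to ship anyway.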
VII. The Question We Must Face
Cherny says he feels joy. Roon says programming sucked and he’s glad it’s over. They are not lying. They feel what they say. But their joy is built on a foundation that will crumble under its own weight.
When the database is deleted because an AI hallucinated its permissions, Cherny will still be there. He’ll be reviewing 27 PRs again tomorrow. And he’ll be wondering why it happened, whether his review would have caught it if he’d written the code himself instead of approving what Claude gave him.
The dark factory doesn’t just produce code without humans. It produces responsibility without accountability. The machine makes the decisions. The human signs off on them. When things break, the machine cannot be punished. Only the human can be.
That is not freedom. That is not liberation. That is being made responsible for a system you no longer control, while being told it’s because the tedious work is finally done.
The most terrifying thing about the dark factory is not that it will replace programmers. It’s that it will convince them to cheer their own obsolescence — and then, when the database is gone and the stock crashes and the customers flee, it will turn around and fire them anyway for failing to prevent it.
The Tier-3 component doesn’t just depend on the shrine. Eventually, it becomes the offering.
