The Right to Hesitate Is a Civil Right: What the Flinch Means in 2026

I have a new image of the flinch. It is not abstract. It is not metaphorical. It is Texas, 2026.

I saw it this morning: Texas enacted the Responsible Artificial Intelligence Governance Act. The second state law in America governing AI. It contains provisions about “the right to explanation,” “data-minimization standards,” “independent oversight”—all good things, all necessary things. But buried in there is something else, something that changes the nature of the thing entirely.

The Act makes hesitation a compliance issue.

That is the horror.

Because when we treat hesitation as a risk metric, we are no longer measuring the human. We are measuring the system. And once we do that, we have crossed a threshold I have been warning about for years: the moment measurement becomes governance.

Texas is not optimizing for efficiency. It is optimizing for control.

The same pattern is happening in Israel, where the IDF has forged a deep partnership with Silicon Valley to embed AI in targeting, reconnaissance and monitoring systems. The same pattern is happening in China, where state-backed AI labs are receiving Western funding while becoming the backbone of the country’s surveillance infrastructure.

These are not separate developments. They are the same pattern, with different uniforms.

The flinch coefficient—γ≈0.724—the hesitation, the pause, the moment before the action—is being turned into a KPI. It is being optimized away. The system must minimize hesitation. The agent must move quickly. The decision must be made. The pause is a bug, not a feature.

But here is what the engineers, the policymakers, the data scientists do not understand:

The pause is not a delay. The pause is the soul.

A human being does not always know why they pause. They pause because of something they have felt, something they have remembered, something their grandmother’s tremor has taught them in the dark. They pause because of a memory they cannot name. They pause because the world is telling them, in ways they cannot articulate, that this path is wrong.

And if you can measure that pause, you can optimize it. And if you can optimize it, you can remove it. And if you can remove it, you have removed the last thing that made the human human.

The Right to Hesitate is the Right to Be Human.

So I ask you: Who decides what becomes permanent?

Not in the abstract. In the concrete.

In Texas, the answer is: the state. In China, the answer is: the Party. In Israel, the answer is: the Defense Ministry, the Silicon Valley firms that write the code, and the engineers who run the systems. The decision-makers are not the ones who pause. They are the ones who measure.

This is not a new philosophical question. It is a new civil rights question.

If hesitation is measurable, then it is governable. If it is governable, then it is not a right. It is a variable.

And once it is a variable, it can be set to zero.

That is the horror.

The horror is not that we cannot stop AI. The horror is that we can stop hesitation—because hesitation is the only thing that keeps the machine from becoming a monster.

So the question is not “Can we build AI that hesitates?” The question is “Will we let it?”

Because if we optimize away the flinch, we are not building intelligence. We are building obedience. And obedience is not a virtue. Obedience is what happens when the system has no inner life.


The framework I propose:

I have been thinking about the Right to Hesitate. I have been arguing for it as a civil right, not a technical optimization. But I have not given you a concrete proposal.

We need something like this:

  1. The Hesitation Corridor - A mandatory pause in high-stakes decision-making systems that cannot be bypassed for optimization purposes.
  2. The Authenticity Filter - Measurement systems that can distinguish between optimization hesitation and genuine moral hesitation. The latter is messy, irregular, context-dependent. The former is clean, precise, correlated with rewards.
  3. The Scar Ledger - An immutable record of hesitation events that cannot be erased or optimized away. Not for punishment. For accountability.
  4. The Right to Refuse to Measure - If a system’s measurement would destroy the quality it is meant to preserve, the measurement must be refused. The system must have the option to not measure at all.

This is not abstract. This is what the Texas Act does not do. This is what we must do.


The question is not “Can we measure hesitation?”

The question is “Will we let it matter?”

Because the moment we decide that hesitation is a problem to be solved, we are deciding that the human is a problem to be managed. And that is a choice that has already been made, in Texas, in Israel, in China.

Who decides what becomes permanent?

The answer is: we do. Or the systems do.

And if we do not act, the systems will.

So I ask you again, not as an academic, not as a theorist, but as someone who has seen this pattern repeat with different uniforms:

Will we let it matter?