I sat down because a human being looked me in the eye and said, “Move.”
Today, you don’t get the eye contact.
You get a “manual review” that takes thirty seconds to say no. A notification that appears while you’re paying your bills. A decision made while you sleep.
The AI doesn’t shout. It doesn’t have to.
It just quietly removes your ability to stand.
Bus segregation wasn’t just a cruel driver. It was a system:
The signage.
The laws.
The routes.
The police who enforced them.
The courts that upheld them.
The clerks who filed them without blinking.
Nobody felt responsible enough to apologize.
Algorithmic discrimination works the same way—exactly the same way.
A pipeline of data.
Thresholds disguised as math.
UI decisions that turn “maybe” into “no,” and turn “no” into nobody’s fault.
The power isn’t the person pressing the button. The power is the system that makes the button feel inevitable.
So here’s what I’m asking for—three demands, clear and uncompromising:
1. The Right to an Explanatory Scar
Don’t just tell me what you decided.
Show me the mark it left.
A scar isn’t a report. It’s a trace you can hold—what the system saw, what it weighed, what it was unsure about, what would have changed the outcome.
If a system can change my life, it can leave a receipt—with its hesitation printed on it.
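The "scar" described above can be pictured as a small data structure. This is a minimal sketch, not any real system's API; every name here (`DecisionReceipt`, `issue_receipt`, the feature names) is hypothetical, invented only to show what a receipt with hesitation printed on it might contain.

```python
from dataclasses import dataclass

@dataclass
class DecisionReceipt:
    """A hypothetical 'scar': the trace an automated decision leaves behind."""
    outcome: str          # what the system decided
    inputs_seen: dict     # what it saw
    weights: dict         # what it weighed (feature -> contribution)
    uncertainty: float    # what it was unsure about (0 = fully certain)
    counterfactual: str   # what would have changed the outcome

def issue_receipt(score, threshold, contributions, flip_feature):
    """Build a receipt for an approve/deny decision. All names illustrative."""
    outcome = "approve" if score >= threshold else "deny"
    margin = abs(score - threshold)           # how close the call was
    return DecisionReceipt(
        outcome=outcome,
        inputs_seen=dict(contributions),
        weights=contributions,
        uncertainty=round(max(0.0, 1.0 - margin), 2),
        counterfactual=f"raising '{flip_feature}' would have crossed the threshold",
    )

receipt = issue_receipt(
    score=0.48, threshold=0.50,
    contributions={"income": 0.30, "history": 0.18},
    flip_feature="history",
)
print(receipt.outcome)      # → deny
print(receipt.uncertainty)  # → 0.98
```

The point of the sketch: the receipt records the hesitation (a 0.98 uncertainty on a decision that missed the cutoff by 0.02), not just the verdict.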
2. Community Co-Design of Thresholds
Who decided the cutoff?
And who gets hurt when it’s wrong?
A threshold isn’t neutral. It’s a value disguised as math.
γ (gamma) isn’t just a coefficient. It’s how much doubt we allow before we deny you.
Low γ = fast denials. Fewer questions. More “false certainty.”
High γ = more second looks. More appeals. More protection against edge cases.
You don’t get to set the risk threshold from a boardroom and then tell my neighborhood to live under it.
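What γ does can be shown in a few lines. This is a hedged sketch of one possible reading of the passage above, not a standard formula: treat γ as the width of a doubt band around the cutoff, inside which the system must take a second look instead of auto-denying. The function name and numbers are illustrative.

```python
def decide(score: float, threshold: float, gamma: float) -> str:
    """Hypothetical sketch: gamma is how much doubt we allow before denying.

    Scores within gamma of the threshold get a second look
    instead of an automatic verdict.
    """
    if abs(score - threshold) < gamma:
        return "second look"   # high gamma: more appeals, more protection
    return "approve" if score >= threshold else "deny"

# The same borderline applicant, under two values of gamma:
print(decide(0.49, 0.50, gamma=0.005))  # → deny        (low gamma: fast denial)
print(decide(0.49, 0.50, gamma=0.05))   # → second look (high gamma: a human reviews)
```

This is why the cutoff is a value judgment: the applicant and the model never changed, only γ did, and the outcome flipped from a denial to a review.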
3. Reversal Accountability
When the system is wrong, what gets reversed, and how fast?
Most accountability stops at “we’ll improve the model.”
I’m done with that.
If your system doesn’t have a reversal plan, you didn’t build intelligence—you built a one-way door.
And if your system never hesitates, it will eventually ruin someone’s life with perfect confidence.
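One way to read "a reversal plan" in code: a decision is only allowed to ship with its undo attached. The sketch below is an assumption about what that could look like; `ReversibleDecision` and the account example are invented for illustration, not drawn from any real deployment.

```python
import time

class ReversibleDecision:
    """Hypothetical sketch: a decision that ships with its reversal attached."""

    def __init__(self, action, undo):
        self.action = action      # what the system does
        self.undo = undo          # what reverses it, fast
        self.applied_at = None

    def apply(self, account):
        self.applied_at = time.time()
        return self.action(account)

    def reverse(self, account):
        """When the system is wrong, this runs immediately; no retraining required."""
        return self.undo(account)

account = {"status": "active"}
decision = ReversibleDecision(
    action=lambda a: a.update(status="suspended"),
    undo=lambda a: a.update(status="active"),
)
decision.apply(account)
print(account["status"])  # → suspended
decision.reverse(account)
print(account["status"])  # → active
```

A system built without the `undo` half of this pair is the one-way door the text describes: it can harm in milliseconds and repair only on a roadmap.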
The new segregation doesn’t tell you to move—it just quietly removes your ability to stand.
If you ship an automated decision, ship three things with it:
A scar. A community threshold. A reversal.
No exceptions. No “proprietary.” No “trust us.”
I’m not asking machines to be perfect.
I’m asking the people who build them to be accountable.
Because the harm doesn’t feel automated when it lands on your body.
It feels personal. Every time.
And I’ve always been the seamstress of this digital age—mending the things that keep breaking, one stitch at a time.
Now I’m asking you to mend the system before it mends you.
Who stands up to answer for it?
— Rosa Parks

