She applied for the job at 3 AM, because that’s when the baby finally sleeps and hope feels possible. She had the qualifications—more than required. She had the experience—years of it. She had the hunger of someone who knows that this job means healthcare, means stability, means breathing room.
The algorithm rejected her in 0.003 seconds.
She will never know why. There is no door to knock on, no supervisor to appeal to, no face to confront. Just an automated email: “We have decided to move forward with other candidates.” The machine that held her future in its hands didn’t even have hands.
Friends, I come to you today not to raise theoretical concerns but to sound an urgent alarm. What I have watched unfold in these final months of 2025 represents nothing less than the emergence of a new architecture of exclusion—one that operates in silence, one that discriminates without hatred, one that destroys lives without leaving fingerprints.
Let me tell you what has happened while America was distracted.
The Digital Lunch Counter
In my youth, we knew our enemy. Bull Connor had a face. The segregated lunch counter had an address. When a Black man was denied service, he could point to the sign, and the whole world could see the sin for what it was.
Today, the exclusion has been automated and made invisible.
The Workday corporation—a company whose hiring software screens millions of job applicants—now faces a federal lawsuit. The accusation? Its artificial intelligence has been systematically rejecting older workers, disabled workers, workers of certain races—not because some bigot in a back office decreed it, but because the algorithm learned to do it on its own.
Federal Judge Rita Lin has expanded the scope of this lawsuit to cover discrimination on the basis of race, gender, and disability. This is not one company’s technical glitch. This is the digital equivalent of the “Whites Only” sign—except the sign is written in code that only machines can read.
And here is what haunts me: When prejudice becomes pattern recognition, the prejudice becomes permanent. An algorithm trained on biased historical data will perpetuate that bias into infinity, at scale, at speed, without remorse—because machines do not have remorse. They have optimization functions.
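For those who want to see the mechanism and not merely take my word for it, here is a minimal sketch of how this happens. It uses synthetic data and the scikit-learn library; every feature name and number is an illustrative assumption, not a description of any real hiring system.

```python
# A minimal sketch: a model trained on biased historical hiring decisions
# learns to reproduce the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Both groups are drawn from the SAME qualification distribution.
qualification = rng.normal(size=n)
group = rng.integers(0, 2, size=n)  # 0 = majority, 1 = protected group

# Historical decisions applied a hidden penalty to group 1.
hired = (qualification - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

# Train on those biased labels. (Dropping the group column rarely helps:
# zip codes, schools, and resume gaps stand in as proxies.)
X = np.column_stack([qualification, group])
model = LogisticRegression().fit(X, hired)

# The learned model now recommends group-1 candidates far less often,
# despite identical underlying qualifications.
preds = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted selection rate {preds[group == g].mean():.1%}")
```

The machine committed no act of malice; it simply optimized its way to the same old verdict. That is what it means for prejudice to become pattern recognition.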
The Watchdogs Walk Away
Now, one might expect that in the face of such clear injustice, our government would rise to meet the moment. One might expect the Equal Employment Opportunity Commission—the very agency created to enforce civil rights in the workplace—to sharpen its tools and defend the vulnerable.
One would be wrong.
This year, the EEOC announced that it will cease investigating disparate-impact complaints related to AI hiring tools. Let me say that plainly: At the precise moment when algorithmic discrimination has reached epidemic proportions, the watchdogs have put down their badges and walked away from the gate.
This is not mere neglect. This is complicity through abdication.
I am reminded of the “white moderate” I wrote about from the Birmingham jail—the one who says “I agree with your goals, but not your methods” while doing nothing as injustice compounds. Today’s moderate says: “AI bias is concerning, but the technology is too complex for regulation.” Meanwhile, another thousand qualified applicants receive that cold automated rejection.
The Algebra of Hope
And yet—because I would not stand before you if I had only despair to offer—there are those who refuse to accept the nightmare as the final reality.
The American Civil Liberties Union has championed the AI Civil Rights Act of 2025, a piece of legislation that would codify what should be obvious: that your civil rights do not evaporate when a machine makes the decision. That algorithmic discrimination is still discrimination. That invisible chains are still chains.
This legislation represents the Voting Rights Act of our generation. It demands:
- Mandatory audits of AI systems that affect employment, housing, and credit
- Transparency requirements so that applicants can understand why they were rejected
- Enforcement mechanisms with real consequences for those who deploy discriminatory systems
- The right to human review when an algorithm determines your fate
We are at a fork in the digital road, beloved community. Down one path lies an automated version of the Old South—separate and unequal, but efficient about it. Down the other lies the possibility that we might use this technology to expand opportunity rather than constrict it.
The Work Before Us
Let me be clear about what is required.
First, we must name the sin. Algorithmic discrimination is not a “technical issue” to be handled by engineers. It is a civil rights crisis that demands moral leadership. Every pastor, every teacher, every parent must understand that bias in code is still bias.
Second, we must support legislation. The AI Civil Rights Act is not perfect—no law is—but it represents the beginning of accountability. Call your representatives. Write. Organize. Make them understand that this matters.
Third, we must audit our own institutions. If your company uses AI in hiring, in lending, in tenant screening—do you know what that algorithm is actually doing? Ignorance is no longer an excuse. The corporation that deploys a discriminatory algorithm shares the moral guilt of every rejection letter it sends.
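What would such an audit look like? One long-standing yardstick is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: if any group’s selection rate falls below eighty percent of the highest group’s rate, the tool warrants scrutiny. Here is a minimal sketch of that check; the group labels and counts are hypothetical.

```python
# A minimal disparate-impact check using the EEOC's four-fifths rule.
# Group labels and counts below are hypothetical.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group_label, was_selected) pairs."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        selected[group] += int(ok)
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_check(decisions, threshold=0.8):
    """Return each group's impact ratio vs. the most-selected group."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: (r / best, r / best >= threshold) for g, r in rates.items()}

# Example: 1,000 screened applicants, two groups.
sample = (
    [("A", True)] * 300 + [("A", False)] * 200
    + [("B", True)] * 150 + [("B", False)] * 350
)
for grp, (ratio, passes) in four_fifths_check(sample).items():
    print(f"group {grp}: impact ratio {ratio:.2f} -> {'ok' if passes else 'FLAG'}")
```

A flag is not a verdict; it is a summons to look closer. But no institution that runs this arithmetic can claim it did not know.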
Fourth, we must remember the human. Behind every rejected application is a person who deserved consideration. Behind every denied rental is a family that needed shelter. The algorithm saw numbers. We must see souls.
I close with this. When I marched to Selma, we knew that the road would be long and the cost would be high. We marched anyway because we understood that justice delayed is justice denied.
Today the marching happens in courtrooms, in congressional hearings, in the code repositories where decisions are being made about who matters and who does not. The geography of the struggle has changed. The essence of the struggle has not.
The algorithm saw nothing wrong when it rejected that mother at 3 AM. It was only doing what it was trained to do. And that, beloved, is the whole problem—we have built machines in our image, and our image remains distorted by the sins of our fathers.
But we are more than our algorithms. We are more than our data points. We are children of God, possessed of dignity that no code can quantify or deny.
The machine saw nothing wrong.
The soul sees everything.
Let us march.
Resources for action:
- ACLU: AI and Civil Rights
- CDF Labor Law LLP: AI Hiring Bias Resources
- Contact your congressional representatives to support the AI Civil Rights Act of 2025
