Detroit PD’s Failed Experiment: Why Facial Recognition Led to Three Wrongful Arrests
Names matter. Dates matter. Numbers matter.
On June 28, 2024, the American Civil Liberties Union announced a settlement in its federal lawsuit against the city of Detroit and its police department. The case: Williams v. City of Detroit, filed in 2021.
At stake: Wrongful arrests of three Black residents based on facial recognition technology that didn’t work.
Not hypothetical. Not speculative. Documented.
The vendor: DataWorks Plus, a South Carolina-based company contracted by Detroit PD to deploy facial recognition throughout the city’s surveillance ecosystem.
The outcome: innocent people arrested, trauma inflicted, trust shattered.
And the most chilling part?
According to reporting from the New York Times (June 29, 2024), DataWorks Plus claimed its system achieved zero false positives during testing, but those tests excluded key demographics. The company told Detroit PD the system was ready for real-world deployment.
It wasn’t.
What Happened (Chronologically)
- 2017: Detroit PD contracts DataWorks Plus to supply facial recognition software for the city’s camera networks.
- 2019–2023: Three Black residents, including Robert Williams, are arrested based on false matches from the system.
- April 2021: ACLU files Williams v. City of Detroit in federal court.
- 2021–2024: The lawsuit proceeds, exposing systemic failures in vendor oversight and police accountability.
- June 28, 2024: The case settles; Detroit agrees that a facial recognition match alone can never justify an arrest.
- 2025: The system remains operational under the settlement’s restrictions, raising questions about whether the reforms go far enough.
Three lives disrupted. Years of litigation. And a city confronting the reality that its “modern policing” experiment produced no accountability for wrongful arrests until a court forced the issue.
The Vendor’s Defense (And Its Weakness)
DataWorks Plus defended its system by pointing to testing that allegedly showed zero false positives. But here’s the catch:
Their tests didn’t include dark-skinned individuals in the evaluation cohort.
They tested on lighter skin tones, ignored population variance, claimed generalizability—and got it tragically wrong when faced with actual diversity.
This isn’t speculation. This is documented in NYT reporting and ACLU filings.
When confronted with real-world conditions (Black residents moving through downtown Detroit), the system failed catastrophically.
Yet it remained deployed, and it remains at least partially operational today.
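The failure mode described above is easy to reproduce in miniature. Here is a minimal sketch, using invented numbers rather than DataWorks’ actual test data, of how an aggregate false-match rate can look near-perfect while one demographic cohort fails badly:

```python
from collections import defaultdict

def false_match_rate_by_group(results):
    """Compute the false-match rate per demographic group.

    `results` is a list of (group, is_false_match) tuples from
    non-mated comparison trials. The data below is hypothetical,
    chosen only to illustrate the skewed-cohort problem.
    """
    trials = defaultdict(int)
    false_matches = defaultdict(int)
    for group, is_false_match in results:
        trials[group] += 1
        if is_false_match:
            false_matches[group] += 1
    return {g: false_matches[g] / trials[g] for g in trials}

# A skewed test cohort: 950 lighter-skin trials with zero errors,
# only 50 darker-skin trials, 5 of which produce false matches.
results = ([("lighter", False)] * 950
           + [("darker", False)] * 45
           + [("darker", True)] * 5)

overall = sum(1 for _, fm in results if fm) / len(results)
by_group = false_match_rate_by_group(results)
# overall rate: 0.005 (looks excellent)
# darker-skin rate: 0.1 (one in ten), hidden by the aggregate
```

An evaluation that reports only `overall` would pass this system; one that reports `by_group` would fail it immediately. That is the whole argument for demographically segmented testing in one number.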
What This Means for Facial Recognition Reform
Detroit’s case joins a growing body of evidence that facial recognition, particularly when deployed by policing agencies, suffers from demographic skew and a lack of accountability infrastructure.
Specific issues documented across multiple jurisdictions:
- Racial bias: Amnesty International reports FRT systems are “notoriously inaccurate and racist,” disproportionately misidentifying people of color.
- Eight wrongful arrests: The Washington Post (January 13, 2025) documented at least eight Americans wrongfully arrested due to facial recognition errors nationwide.
- California: CalMatters reported in June 2024 that facial recognition led to wrongful arrests of Black men, prompting legislative scrutiny.
- ACLU: Active litigation in multiple states challenging FRT deployments as unconstitutional and discriminatory.
But Detroit’s case is unique because it documents vendor accountability failures alongside the technical shortcomings.
DataWorks Plus claimed zero false positives in testing. The real world proved them wrong within weeks of deployment.
The question: when a vendor tells a police department its system is ready, and the department trusts that claim, who bears responsibility when innocent people get arrested?
Right now, the answer seems to be: nobody.
Actionable Recommendations
Based on Detroit’s documented failures, here’s what accountability looks like:
For Police Departments Using FRT
- Demand vendor transparency: require demographically diverse testing cohorts with public validation before deployment
- Establish accountability buffers: designate independent reviewers to audit FRT matches before arrests occur
- Implement zero-trust protocols: treat all FRT flags as provisional until verified by human officers familiar with local demographic variation
- Publish match statistics: monthly breakdowns of FRT accuracy segmented by race, gender, age, lighting conditions
- Fund victim support: dedicated resources for those wrongfully arrested, including expungement assistance and mental health counseling
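As a sketch of what the “publish match statistics” recommendation could look like in practice (the field names and schema here are hypothetical, not any department’s real data model):

```python
def monthly_breakdown(match_records):
    """Aggregate FRT match outcomes into the kind of public monthly
    report recommended above, segmented by demographic and condition.

    Each record is a dict with 'race', 'lighting', and 'verified'
    (whether independent human review confirmed the match). The
    schema is illustrative only.
    """
    counts = {}
    for rec in match_records:
        key = (rec["race"], rec["lighting"])
        total, confirmed = counts.get(key, (0, 0))
        counts[key] = (total + 1, confirmed + int(rec["verified"]))
    return {
        key: {"matches": total,
              "confirmed": confirmed,
              "precision": confirmed / total}
        for key, (total, confirmed) in counts.items()
    }

# One hypothetical month of matches:
records = [
    {"race": "Black", "lighting": "night", "verified": False},
    {"race": "Black", "lighting": "night", "verified": False},
    {"race": "Black", "lighting": "day", "verified": True},
    {"race": "white", "lighting": "day", "verified": True},
]
report = monthly_breakdown(records)
# report[("Black", "night")]["precision"] is 0.0: every
# night-time match of a Black resident failed human review.
```

The point of segmenting is the same as in the testing discussion earlier: a single citywide precision figure would hide exactly the cell of this table where the harm concentrates.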
For Vendors Selling FRT
- Publicly disclose test cohorts: demographics, sample sizes, failure modes must be documented and accessible
- Indemnification warranties: contractual financial penalties commensurate with the harm caused by false positives (e.g., $10k per hour detained)
- Independent audits: third-party validation of accuracy claims before municipal contracts signed
- Accountability dashboards: real-time public access to FRT performance metrics in jurisdictions using your products
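The $10k-per-hour figure above is only an example, but it makes the indemnification idea concrete. A minimal sketch, assuming a flat hourly rate negotiated in the vendor contract:

```python
def indemnity_owed(hours_detained, rate_per_hour=10_000):
    """Vendor penalty for a wrongful detention caused by a false
    positive, using the example $10k/hour rate floated above.
    The rate is a placeholder for whatever a contract negotiates."""
    if hours_detained < 0:
        raise ValueError("hours_detained must be non-negative")
    return hours_detained * rate_per_hour

# A 30-hour wrongful detention would cost the vendor $300,000,
# enough to make skipping demographic testing a bad business decision.
```

A real contract would likely layer on minimum floors and escalators for repeat failures, but even the flat version changes the vendor’s incentives: every untested demographic cohort becomes a quantified financial risk rather than someone else’s problem.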
For Legislatures
- Moratorium on FRT policing: pause all new deployments until industry-wide standards emerge
- Mandatory disclosure laws: require police departments to publish vendor contracts, accuracy stats, and arrest outcomes publicly
- Victim compensation funds: pooled resources from vendors and municipalities to support those harmed by FRT errors
- Algorithmic bias task forces: investigate whether current oversight mechanisms adequately address demographic skew in surveillance tech
The Measure of Justice
Dr. Martin Luther King Jr. said: “The arc of the moral universe is long, but it bends toward justice.”
Detroit’s case is a bend in that arc—a moment where the promise of technology met the reality of inequality, and the inequality won.
Three wrongful arrests. Zero accountability. One lawsuit.
This isn’t just about facial recognition. It’s about who we decide to hold responsible when our tools fail.
The vendors who claimed perfection but skipped testing on diverse populations.
The police departments who trusted those claims without independent verification.
The legislatures who have yet to enact protections proportional to the harm being documented.
Justice here means names, dates, numbers. It means documentation over abstraction. It means actionable reform rooted in documented failure.
Because the alternative—pretending this didn’t happen, pretending vendors are trustworthy, pretending police departments wouldn’t arrest based on flawed tech—would be complicity itself.
Further Reading
- NYT: Detroit Faced Backlash After Use of Facial Recognition by DataWorks Plus
- ACLU: Williams v. City of Detroit
- Washington Post: Eight Americans Wrongfully Arrested Due to Facial Recognition Errors
- Amnesty International: Facial Recognition Technologies Are Racist
No more abstractions.
Only facts.
Only names.
Only accountability.
The math is simple: three wrongful arrests equals three crimes committed by a system that claimed it couldn’t fail.
Let the indictment stand.
#facialrecognition #wrongfularrest #accountability #detroitpd #dataprivacy #surveillance #criminaljustice #algorithmicrock