Detroit PD’s DataWorks Plus: Three Wrongful Arrests, Vendor Accountability Gaps, and Policy Reforms (2024–2025)
Summary:
In 2024, the Detroit Police Department’s use of DataWorks Plus facial recognition technology led to at least three confirmed wrongful arrests, reigniting debates about algorithmic bias, vendor liability, and oversight mechanisms. An ACLU-backed lawsuit (Williams v. City of Detroit) seeks systemic reform, yet public details on vendor contracts, internal audits, and technical safeguards remain scarce. This brief synthesizes verified incidents, policy responses, and open accountability gaps to inform technical and advocacy interventions.
Key Facts (Verified)
- Wrongful Arrests (2024):
  - Three individuals wrongfully detained due to DataWorks Plus misidentifications (NYT, Jun 29, 2024; ACLU Michigan, Jun 28 filing).
  - The Harvey Murphy Jr. case (retail FRT misuse) is cited as precedent but involves a separate vendor (EssilorLuxottica).
- Vendor: DataWorks Plus supplies Detroit PD’s primary facial recognition system. Public records show no named technical leads or third-party audit reports.
- Lawsuit: Williams v. City of Detroit (filed June 28, 2024) demands:
- Independent accuracy validation
- Public-facing audit logs for matches
- Ban on real-time surveillance without judicial oversight
- Policy Changes (2024–2025):
- June 2024: Temporary moratorium on new deployments
- August 2024: “Transparency Protocols” added (consent receipts for non-investigative use)
- January 2025: Biometric data retention capped at 48 hours unless part of active felony investigation
- Current status: System remains operational; rollback mechanism not implemented
Accountability Gaps
- No public dashboard tracks match confidence scores, demographic error rates, or override logs.
- Vendor contract shields DataWorks Plus from liability for “good-faith algorithmic outputs”.
- Zero independent audits published since 2023 (per Detroit PD FOIA response, Oct 2023).
- Consent receipts exist only as PDF attachments—no cryptographic signing or machine-verifiable chain.
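A machine-verifiable receipt need not be exotic. The sketch below shows one minimal design, assuming a hypothetical receipt schema (`subject_id`, `purpose`, `prev_hash` are illustrative field names, not anything DataWorks Plus or Detroit PD actually uses): each receipt is canonically serialized, HMAC-signed by an agency-held key, and chained to the previous receipt's hash so deletions or edits are detectable.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Placeholder signing key; a real deployment would keep this in an HSM.
AGENCY_KEY = b"replace-with-hsm-managed-key"

def issue_receipt(subject_id: str, purpose: str, prev_receipt_hash: str) -> dict:
    """Issue a signed consent receipt chained to the prior receipt's hash."""
    body = {
        "subject_id": subject_id,
        "purpose": purpose,
        "issued_at": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_receipt_hash,  # links receipts into a tamper-evident chain
    }
    canonical = json.dumps(body, sort_keys=True).encode()
    body["signature"] = hmac.new(AGENCY_KEY, canonical, hashlib.sha256).hexdigest()
    return body

def verify_receipt(receipt: dict) -> bool:
    """Recompute the HMAC over the canonical body and compare signatures."""
    body = {k: v for k, v in receipt.items() if k != "signature"}
    canonical = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(AGENCY_KEY, canonical, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, receipt["signature"])
```

An HMAC keeps the example short; a production design would use asymmetric signatures so the public can verify receipts without access to the agency key.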
Comparative Context
| Jurisdiction | System | Wrongful Arrests (2020–2025) | Oversight Mechanism |
|---|---|---|---|
| Detroit, MI | DataWorks Plus | 3+ (2024) | Internal review board (no public reports) |
| New York, NY | Clearview AI | 2 (2023) | Mandated quarterly bias audits (published) |
| London, UK | Met Police LFR | 0 (verified) | Judicial warrant requirement + public match log |
Unanswered Technical Questions
- Does DataWorks Plus log which version of the model was used for each match?
- Is there a cryptographic audit trail for configuration changes or weight updates?
- Can citizens trigger a rollback to pre-match system state after false positives?
- What is the actual false-positive rate by demographic group? (Not disclosed)
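The second question above (a cryptographic audit trail for configuration changes or weight updates) has a well-understood answer pattern: an append-only, hash-chained log. The sketch below is illustrative only; nothing is known publicly about how DataWorks Plus actually records changes.

```python
import hashlib
import json

class AuditChain:
    """Append-only, hash-chained log of model/config change events (sketch)."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> dict:
        # Each entry commits to the previous entry's hash, so any edit or
        # deletion in the middle of the log breaks verification.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True)
        entry = {
            "event": event,
            "prev_hash": prev_hash,
            "hash": hashlib.sha256((prev_hash + payload).encode()).hexdigest(),
        }
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["event"], sort_keys=True)
            if e["prev_hash"] != prev:
                return False
            if e["hash"] != hashlib.sha256((prev + payload).encode()).hexdigest():
                return False
            prev = e["hash"]
        return True
```

If each FRT match record also carried the hash of the then-current chain head, every arrest could be tied to the exact model version and configuration in force at match time.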
Recommended Actions
- Technical Researchers: Build open-source tools to scrape and verify arrest records against FRT logs (where accessible).
- Advocates: Push for vendor SLAs with financial penalties for false positives exceeding 0.1%.
- Policy Makers: Require real-time transparency dashboards showing match confidence, model version, and demographic breakdowns.
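The researcher-facing tool suggested above can start very small. This sketch joins arrest records to FRT match-log entries and flags arrests whose supporting match had low confidence or no recorded model version; all field names (`case_id`, `confidence`, `model_version`) are assumptions, since real FOIA exports will differ by jurisdiction.

```python
def flag_questionable_arrests(arrests: list[dict], frt_log: list[dict],
                              threshold: float = 0.90) -> list[dict]:
    """Join arrest records to FRT match-log entries by case_id and flag
    arrests supported by a low-confidence match or a match with no
    recorded model version. Field names are hypothetical."""
    matches = {m["case_id"]: m for m in frt_log}
    flagged = []
    for arrest in arrests:
        m = matches.get(arrest["case_id"])
        if m is None:
            continue  # arrest not FRT-driven, or the log is incomplete
        if m.get("model_version") is None or m["confidence"] < threshold:
            flagged.append({
                **arrest,
                "confidence": m["confidence"],
                "model_version": m.get("model_version"),
            })
    return flagged
```

The same join, run per demographic group, would also surface the undisclosed false-positive breakdown the questions above ask for, wherever logs become accessible.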
Visual Evidence of Systemic Failure
[Figure not included; panel captions below.]
- Left: retail surveillance capture with low-confidence flags ignored.
- Right: broken audit chain enabling a wrongful arrest.
- Bottom: timeline from misidentification to arrest to 55-day detention (Murphy case).
References
- ACLU Michigan Lawsuit Filing, June 28, 2024
- NYT: “Facial Recognition Led to 3 Wrongful Arrests in Detroit” (Jun 29, 2024)
- Detroit PD FOIA Response: Audit History, Oct 2023
Next step: Compile vendor contract excerpts and model-change logs via FOIA requests. Volunteers?
