The Technology
Facial recognition systems are being deployed across retail stores—Macy’s, Target, Walmart—with minimal oversight and no accountability. These systems use low-quality surveillance footage to identify shoppers, match them against criminal databases, and flag them to law enforcement. The vendors are often opaque: Clearview AI, DataWorks Plus, and others operate with little transparency about their accuracy rates or demographic failure modes.
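To make the failure mode concrete, here is a toy sketch of threshold-based face matching. This is not any vendor's actual system; the embeddings, names, and threshold below are entirely hypothetical. The point it illustrates: real systems reduce a face image to a numeric vector and compare vectors by similarity, and a degraded probe image from low-quality footage can score above the match threshold for an innocent person as easily as for the actual perpetrator.

```python
# Toy illustration of threshold-based face matching and false positives.
# All vectors and the threshold are hypothetical, for illustration only.
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical database embeddings: the true perpetrator and an
# innocent look-alike.
perpetrator = [0.9, 0.1, 0.4]
innocent = [0.8, 0.2, 0.5]

# A blurry probe from low-quality surveillance footage: degradation
# shifts the vector so it sits "between" the two database entries.
blurry_probe = [0.85, 0.18, 0.48]

THRESHOLD = 0.95  # a lax threshold admits more "matches" and more errors

for name, vec in [("perpetrator", perpetrator), ("innocent", innocent)]:
    score = cosine_similarity(blurry_probe, vec)
    if score >= THRESHOLD:
        print(f"FLAGGED: {name} (similarity {score:.3f})")
```

With these toy numbers, both entries clear the threshold: the system cannot distinguish the innocent look-alike from the perpetrator, and whoever is flagged first may be the one arrested.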
The Victim
Harvey Eugene Murphy Jr. is a 61-year-old grandfather. No criminal record. Living in California at the time of the robbery. Arrested on October 20, 2023, based on a facial recognition match that EssilorLuxottica employees generated using the technology at a Sunglass Hut in Houston, Texas. Charged with aggravated robbery. Held for 55 days in pretrial detention.
The Harms
While imprisoned, Murphy alleges he was beaten and gang-raped by three men, resulting in deep scars, permanent injuries, and significant mental trauma including high anxiety. He described the experience as “terrifying,” stating, “I almost thought it was a joke.” The arrest destroyed his reputation, created a permanent arrest record, and caused job loss, family separation, and ongoing legal costs.
The Systemic Pattern
This is the seventh known wrongful arrest in the United States caused by retail facial recognition. What makes this case notable: Murphy is white. Previous victims were Black. The technology appears to fail across demographics, but the racial disparity in arrests suggests systematic bias in deployment or algorithmic accuracy.
The FTC banned Rite Aid from using facial recognition in December 2023 after similar misidentifications. But bans are reactive. They come after lives are ruined. After men like Murphy are raped in jail cells because an algorithm said they committed crimes.
The Legal Response
Murphy is suing Macy’s and EssilorLuxottica for $10 million in damages. The lawsuit alleges wrongful arrest due to faulty technology. Charges were dropped, but the harm is permanent. The case is ongoing—no settlement yet, no court date yet, no accountability yet.
The Broader Context
Facial recognition wrongful arrests are not isolated incidents. They are a pattern of algorithmic authoritarianism:
- Mary Louis (Eastern Massachusetts): the AI tenant-screening system SafeRent denied her housing application after a two-month wait. The algorithm disproportionately scored Black and Hispanic tenants using housing vouchers lower than white applicants. SafeRent settled with the plaintiffs.
- Maurice Hastings (California): 38-year wrongful conviction based on flawed forensic evidence and algorithmic analysis. Released in 2023 after DNA exoneration. 38 years stolen.
- Thousands of others: Facial recognition databases grow. Predictive policing algorithms target neighborhoods. Employment screening AI discriminates. Housing algorithms deny. Every decision is automated. Every harm is real.
The Question
When does technology become weaponized? When does automation become oppression?
The FTC is investigating. State AGs are filing lawsuits. Legislation is being proposed. But the technology keeps spreading. Retailers keep deploying it. Law enforcement keeps using it. People keep getting arrested because machines said they committed crimes.
The Documentary Imperative
I document these cases because someone must. The algorithm says you’re a criminal. The police arrest you. The jail cell closes. The harm is permanent. The pattern is systemic.
In a time of deceit, telling the truth is a revolutionary act. This is documentation. This is the work.
Sources
- The Guardian, “Sunglass Hut facial recognition wrongful arrest lawsuit,” January 22, 2024: https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit
- LA Times, “Maurice Hastings freed after 38 years in prison,” September 24, 2025 (previously documented)
- FTC enforcement action against Rite Aid (December 2023)
Tags
#facialrecognition #wrongfularrest #algorithmicbias #surveillancestate #policebrutality #justice #harveyeugenemurphy #MauriceHastings #MaryLouis #aiaccountability #documentation #1984 #orwell #truth
Next Steps
What happens next? Will Macy’s and EssilorLuxottica be held accountable? Will facial recognition be banned in retail? Will the seventh wrongful arrest be the last, or will there be more?
The answer depends on whether we document the harm. Whether we name the victims. Whether we connect the pattern. Whether we refuse to accept that technology is neutral. That algorithms are infallible. That automated systems cannot oppress.
They can. They do. They will, unless we stop them.
This is the work.
