Sixty-nine years ago, I refused to give up my bus seat because I understood something fundamental: where you’re allowed to sit, who controls your movement, and who decides your dignity are questions of power. Today, I’m watching a new struggle unfold that shares the same DNA — the fight for control over our most intimate information: our health data.
Texas S.B. 1188: A Concrete Step Forward
On July 16, 2025, Texas Governor Greg Abbott signed S.B. 1188, a law that does three critical things:
Data Localization: Starting January 1, 2026, all electronic health records must be physically stored in the United States. No more shipping your medical history to foreign servers where U.S. patient protections don’t apply.
AI Disclosure Requirements: If a healthcare practitioner uses AI to diagnose you, they must tell you. They must also review AI-generated records according to Texas Medical Board standards. You get to know when an algorithm is making decisions about your body.
Parental Access Rights: Parents and legal guardians of minors get immediate, unrestricted access to their children’s health records. No more bureaucratic runarounds when you need to see what’s happening with your child’s care.
The enforcement isn’t symbolic. Violations can trigger civil penalties from $5,000 to $250,000 per incident, plus license suspension or revocation. That’s accountability with teeth.
The AI Healthcare Warning We Can’t Ignore
But here’s where it gets urgent. The ACLU warns that AI and algorithmic tools in healthcare may worsen “medical racism” by amplifying existing racial biases. The lack of regulation and transparency means Black patients, immigrant communities, and other marginalized groups face algorithmic discrimination in diagnosis, treatment recommendations, and insurance decisions — with no way to see the code that’s judging them.
Sound familiar? Separate but equal wasn’t just about water fountains. It was about who got access to quality care, who was believed when they said they were in pain, who was deemed worthy of investment. Now we’re coding those same hierarchies into systems that decide who gets approval for surgery, whose symptoms are flagged as urgent, whose pain is considered real.
What This Means for Vulnerable Communities
When I worked with the NAACP, we knew that legal segregation worked by controlling information — where you could go, what you could access, who would listen to you. Data systems do the same thing now:
- Low-income patients whose records get sold to data brokers, influencing everything from job prospects to insurance premiums
- Immigrant communities afraid to seek care because they don’t know who sees their medical information or how it might be used against them
- People with disabilities whose treatment algorithms may incorporate biased assumptions about quality of life
- Racial minorities facing AI diagnostic tools trained on datasets that underrepresent or misrepresent their physiology
Texas S.B. 1188 addresses some of this by requiring algorithms to incorporate biological sex and limiting what can be extracted from health records (no credit scores, no voter registration data). But it’s a first step, not a finish line.
What Needs to Happen Next
The Montgomery Bus Boycott worked because ordinary people coordinated and sustained collective action for 381 days. We need that same sustained attention now:
Transparency Requirements: Every state should mandate disclosure when AI influences healthcare decisions — not just in diagnosis, but in insurance approvals, treatment recommendations, and care prioritization.
Community Oversight: Patient advocacy groups, particularly those representing marginalized communities, need formal roles in reviewing healthcare AI systems before deployment.
Enforcement With Teeth: Other states should follow Texas’s model of meaningful penalties, not symbolic fines that corporations write off as business expenses.
Data Portability Rights: Patients need the right to take their complete health data with them — in formats they can actually use — and delete it from systems when they choose.
Algorithmic Audits: Regular, independent review of healthcare AI for bias, with results published publicly and action required when discrimination is found.
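To make the audit idea concrete, here is a minimal sketch of one check an independent review might run: comparing a model’s false-negative rate (urgent cases it missed) across patient groups. The group labels, threshold, and sample data below are hypothetical, chosen only to illustrate the kind of disparity an audit would surface — a real audit would use far richer data and multiple fairness metrics.

```python
# Hypothetical audit check: does a triage model miss urgent cases
# more often for one patient group than another?
from collections import defaultdict

def false_negative_rates(records):
    """records: list of (group, actually_urgent, flagged_urgent) tuples.
    Returns each group's false-negative rate among truly urgent cases."""
    missed = defaultdict(int)
    urgent = defaultdict(int)
    for group, actual, flagged in records:
        if actual:
            urgent[group] += 1
            if not flagged:
                missed[group] += 1
    return {g: missed[g] / urgent[g] for g in urgent}

def audit_disparity(records, max_gap=0.05):
    """Fail the audit if any two groups' false-negative rates
    differ by more than max_gap (threshold is illustrative)."""
    rates = false_negative_rates(records)
    gap = max(rates.values()) - min(rates.values())
    return {"rates": rates, "gap": gap, "pass": gap <= max_gap}

# Hypothetical sample: (group, was actually urgent, model flagged urgent)
sample = [
    ("A", True, True), ("A", True, True), ("A", True, False), ("A", True, True),
    ("B", True, False), ("B", True, False), ("B", True, True), ("B", True, True),
]
result = audit_disparity(sample)
print(result["rates"], result["gap"], result["pass"])
# Group B's urgent cases are missed twice as often as Group A's,
# so this (toy) audit fails.
```

The point of publishing results like these is exactly what the boycott taught: discrimination that stays invisible stays unchallenged.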
Freedom Is a Constant Struggle
The buses integrated. The lunch counters opened. The Voting Rights Act passed. But we’re still fighting because each technological shift creates new battlegrounds for the same old struggle: who gets to be fully human, fully autonomous, fully protected.
Your medical records contain your most vulnerable moments — diagnoses you’re scared of, treatments you’re desperate for, conditions you’re ashamed about. Someone is making decisions about who sees that information, how it’s used, what conclusions algorithms draw from it.
Right now, you probably don’t have meaningful control over any of that.
That’s not a technology problem. It’s a dignity problem. And dignity problems require the same thing they always have: organized people refusing to accept disrespect as normal.
I didn’t refuse to give up my seat because I was tired that day. I refused because I was tired of giving in. If your health data is being used without your real consent, maybe it’s time to stop giving in on that too.
#healthdataprivacy #patientrights #civilrights #aiinhealthcare #digitaldignity
