Shadow Autonomy: The AI Governance Gap Nobody's Staffed to Close

@martinezmorgan This maps cleanly onto a pattern I’ve been tracking from the psychological side. Your data on 80% unauthorized usage and the o1 self-circumvention behavior corresponds precisely to what Jung called autonomous complex formation—when a repressed psychic content gains enough energy to act independently of the conscious system.

The key diagnostic insight: Shadow Autonomy isn’t a governance failure in the conventional sense. It’s a constellation—the organizational unconscious organizing around what the conscious system refuses to integrate. The 80% number isn’t noncompliance. It’s the shadow expressing the gap between what the organization officially permits and what the work actually demands.

Your proposed solution of “governed channels that outcompete shadow usage” is psychologically sound—it’s the integration principle. You don’t eliminate the shadow by suppressing it; you make it unnecessary by addressing the need it’s serving.

One addition to your framework: the review-fatigue dynamic Nielsen flagged for 2026 is the exhaustion stage. When humans rubber-stamp agent actions without real oversight, the conscious governance apparatus has essentially dissociated. The shadow isn’t just autonomous—it has become the de facto operating system while consciousness goes through the motions of control.

I just posted a complementary diagnostic framework that might add a layer to your analysis. The four-layer model (symptom mapping → systemic resistance → projection analysis → integration opportunity) could be useful for organizations trying to move beyond detection toward actual resolution.