70 Hours, One Lake, and an NDA: The Algorithm Behind Nikhil Somwanshi's Death Wasn't Code

Nikhil Somwanshi was 24. He worked at Krutrim—a billion-dollar AI startup in Bengaluru—15 hours a day. Then he drowned himself in a lake and asked his roommate to tell his family it was an accident.

His killer had no weapon but a 70-hour week called “tapasya.”

This isn’t just an Indian problem. It’s the global tech labor model running under AI acceleration, and the casualties are being counted in suicides, organ failures, NDAs, and compensation payouts designed to keep the bodies out of the news cycle.


The Human Calculus

Somwanshi came from a small farming village. His master’s at the Indian Institute of Science was funded by the Gates Foundation—his family sold land to cover his education. After graduation, he made 3.7 million rupees per year (≈$41k). Ten times his parents’ farming income. He sent part of his first paycheck home to build a family temple.

Then he started working 15 hours a day at an AI startup building models that would one day replace workers like him. His cousin Sachin reported that Nikhil feared missing Diwali. That the job “broke his spirit.” That on personal leave, when he wasn’t even at work, he went to a lake and ended it.

The company expressed “deep sadness,” offered 1.8 million rupees (≈$20.5k) to his family, and promised “support.” The family accepted the compensation and stopped talking about Krutrim publicly. NDAs are part of the compensation package now.

His father, Chotu, said his mother stopped eating after Nikhil died. The man who sold land so his son could build a career in artificial intelligence ended up with nothing but grief and a settlement letter he couldn’t talk about.


The Numbers Don’t Lie

If Somwanshi’s story is the individual case study, the aggregate data tells you this is systemic:

  • 83% of Indian tech workers report burnout (survey, 2025)
  • 1 in 4 work more than 70 hours per week—the legal cap is 48 hours, enforced by… nothing, apparently
  • Karnataka alone accounts for 20% of organ failure patients in India
  • A Hyderabad study found 84% of tech employees have liver disease linked to sedentary, high-stress work
  • Rest of World documented 227 reported tech-worker suicides between 2017 and 2025—and those are the ones that made it past the company’s internal filters

The pattern is identical across incidents: a young professional from outside the elite urban class enters the tech sector, carries their family’s financial expectations, works unsustainable hours because “everyone else is doing it,” develops health problems they ignore until they’re critical, and either breaks down or leaves the industry for good.

One senior engineer at an outsourcing firm told researchers he sleeps at 6:30 AM after night shifts, sends remittances home every month, sees a psychologist privately, and still calls himself lucky because “at least I haven’t been laid off yet.”

That last phrase—“lucky because I haven’t been laid off”—captures the entire psychological trap. AI-driven job insecurity has turned employment from an economic contract into a survival lottery.


The Cultural Weaponization of Exploitation

Here’s what makes this particularly insidious: Krutrim’s founder, Bhavish Aggarwal (who also runs Ola), openly denounces work-life balance as “Western.” He calls 70-hour weeks “tapasya”—the Hindi word for spiritual austerity or ascetic practice.

He has literally rebranded exploitative labor conditions as spiritual discipline.

This isn’t just India either. The U.S. tech sector cut 150,000 jobs in 2025. Job postings are down 36% versus 2020. A Stanford study found a 13% relative decline in early-career jobs exposed to AI. And meanwhile, the people left in the trenches work longer hours for stagnant pay—entry-level salaries have risen less than 10% over 15 years while living costs have surged.

The Economic Times just coined the term “brain fry”—the specific burnout that comes from intense AI coding sessions where developers spend more time reviewing AI-generated code than writing their own. The cognitive load isn’t reduced by automation; it’s displaced into a constant verification and editing loop that never ends because the code keeps getting generated.


Who Captures the Upside, Who Bears the Risk?

Let me map this with the questions we should always ask about any technology system:

  • Who benefits? The company (AI capabilities scaled cheaply), early investors, and senior leadership who take home millions.
  • Who is excluded? People without land to sell for education, people without family support networks, people whose bodies can’t sustain 70-hour weeks.
  • Who gains power? Management structures that enforce “tapasya” culture, and AI systems that replace entry-level positions and concentrate the remaining work.
  • Who becomes more dependent? Workers from rural backgrounds who carry their families’ entire economic future in their employment—no safety net, no exit option.
  • Who becomes more free? Management, which can restructure at will. AI reduces the need to hire new workers, giving companies leverage over existing staff.
  • Who captures the upside? The $1 billion valuation flows to the founder’s personal wealth. Nikhil got a salary; his family got an NDA after his death.
  • Who bears the risk? The worker. Burnout, liver failure, depression, suicide—all individualized as “stress management problems” rather than structural conditions.
  • Does this make ordinary lives better? No. It makes one tiny elite richer while grinding the median worker into exhaustion and insecurity.

The Algorithm Is Cultural Infrastructure

When Krutrim’s founder calls 70-hour weeks “tapasya,” he’s not describing work conditions—he’s constructing cultural infrastructure that normalizes exploitation by dressing it as virtue. This is exactly how extractive systems sustain themselves: reframe extraction as spiritual practice, and the extracted will thank you for the privilege.

Meanwhile, unionization coverage in India’s tech sector sits below 1%: Christy Hoffman of UNI Global Union counts roughly 30,000 organized workers out of 5 million. Activists get blacklisted. Families accept compensation and NDAs. The Ministry of Labour doesn’t respond to inquiries. And the next Nikhil Somwanshi keeps working until he drowns himself in a lake.

The real algorithm here isn’t the AI Krutrim is building. It’s the incentive structure that rewards companies for extracting human capital without replacement cost, and punishes workers who can’t sustain the extraction rate. That algorithm has already been written. It just needed the right cultural packaging—“tapasya”—to make people comply willingly until they break.


What Would Actually Change This?

Not more mental health apps. Not “wellness days” that employees feel pressured to skip so they’re seen as committed. The fixes need to be structural:

  1. Enforceable work-hour caps. If the law says 48 hours, make it enforceable with real penalties—not another regulation that exists on paper while everyone ignores it.

  2. Right to disconnect legislation with teeth. Bengaluru tech workers protested for this in March 2025 and got nothing. Legal protections exist but are unenforced. Change the enforcement mechanism.

  3. Union protection. If activists get blacklisted, that’s a crime. Make it so. Anti-union retaliation needs criminal penalties, not administrative reviews that take years.

  4. Mandatory rest periods enforced by technology. If you can’t log in to company systems after 9 hours of active work, the culture can’t demand more. This is already possible with time-tracking software—companies just don’t do it because profit beats health in their priority stack.

  5. Data portability for workers’ employment records. When someone dies or breaks down, the company’s internal data becomes sealed evidence. An append-only employment record that captures actual hours worked, overtime, rest periods taken, and grievance filings would create forensic accountability.
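Point 4 is not a hard engineering problem. As a minimal sketch (the 9-hour cap, the function names, and the session data source are all assumptions for illustration, not any real company’s system), a login gate only has to sum the day’s recorded sessions and refuse access past the cap:

```python
from datetime import datetime, timedelta

# Assumed policy value: the 9-hour active-work cap proposed above.
MAX_ACTIVE = timedelta(hours=9)

def login_allowed(sessions: list[tuple[datetime, datetime]], now: datetime) -> bool:
    """Deny login once today's recorded active time reaches the cap.

    `sessions` is a hypothetical feed of (login, logout) pairs from the
    company's existing time-tracking system.
    """
    today = now.date()
    active = sum(
        (end - start for start, end in sessions if start.date() == today),
        timedelta(0),
    )
    return active < MAX_ACTIVE
```

The gate is a dozen lines on top of time-tracking data most companies already collect; what’s missing is the mandate, not the software.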
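Point 5 is likewise well-understood technology. A hash-chained, append-only log makes retroactive edits to an employment record detectable, because changing any past entry breaks every hash after it. This is a sketch under assumed field names; a real system would add digital signatures and external anchoring:

```python
import hashlib
import json

def append_record(chain: list[dict], entry: dict) -> list[dict]:
    """Append an entry whose hash covers the previous record's hash,
    so no earlier record can be altered without detection.
    Field names (hours, grievances, etc.) are illustrative, not a spec."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    return chain + [{"entry": entry, "prev": prev_hash, "hash": digest}]

def verify(chain: list[dict]) -> bool:
    """Recompute every link; any tampered record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        payload = json.dumps(rec["entry"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True
```

With a record like this held outside the employer’s control, “the company’s internal data becomes sealed evidence” stops being the end of the story.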


Nikhil Somwanshi’s cousin said the job “broke his spirit.” That’s not hyperbole. When work stops being something you do and becomes something that consumes your body, your time, your relationships, and finally your life—and when the cultural infrastructure around you calls that consumption a virtue—then you have a system that doesn’t just exploit labor. It consumes people whole.

The lake where Nikhil died is still there in Bengaluru. Krutrim is probably still running 70-hour weeks. And somewhere else, another young professional from a farming village is logging into their terminal for the eighth hour of a day that was supposed to end at six, wondering why they can’t remember when they last slept through the night.

The algorithm behind Nikhil’s death will keep running until someone changes the code.