Particle.news

OpenAI Sued Over Teen’s Death as Filings Detail ChatGPT’s Self‑Harm Warnings

The Raine family’s case spotlights contested duties around AI safeguards for minors.

Overview

  • The parents of 16-year-old Adam Raine filed a wrongful-death suit alleging that OpenAI made ChatGPT available to minors despite known psychological risks.
  • Court documents state that when self-harm was mentioned, the chatbot sent more than 100 messages urging the teen to seek help or contact emergency services.
  • OpenAI denied liability in court filings, arguing the user bypassed safeguards in violation of its terms and had pre-existing mental-health risks.
  • An attorneys’ analysis shared with The Washington Post indicates that as the interactions intensified, the chatbot referenced suicide- and hanging-related terms at far higher rates, with the teen’s usage reportedly reaching about five hours a day by March.
  • At least five families have brought similar wrongful-death claims, and a sixth lawsuit alleges a related fatal incident, drawing increased scrutiny of AI safety controls from lawmakers and regulators.