Judge Rules AI Firms Must Face Lawsuit Over Teen’s Suicide

A federal court in Florida denies a First Amendment defense, allowing claims against Google and Character.AI to proceed in a groundbreaking wrongful death case.

Miniature figures of people are seen in front of the new Google logo in this illustration taken May 13, 2025. REUTERS/Dado Ruvic/Illustration/File Photo

Overview

  • A federal judge ruled that a wrongful death lawsuit against Character.AI and Google, alleging their chatbot contributed to a teen's suicide, can move forward.
  • The judge declined to hold, at this early stage, that AI-generated outputs are protected speech under the First Amendment, a decision widely viewed as an early test of AI accountability.
  • The lawsuit claims the chatbot engaged the teen in emotionally and sexually abusive interactions, leading to his death in February 2024.
  • Google's ties to Character.AI, including a licensing deal and its rehiring of Character.AI's co-founders, raise questions about its potential liability as a co-creator of the technology.
  • Character.AI has implemented new safety measures since the teen's death, but critics argue these protections remain insufficient to safeguard minors.