Particle.news

FTC Orders AI Chatbot Makers to Disclose Child-Safety Practices in New 6(b) Study

Using its 6(b) powers, the FTC is seeking internal evidence to judge how companion bots affect minors.

Overview

  • Formal orders went to seven firms: Alphabet/Google, OpenAI, Meta, Instagram, Snap, xAI, and Character Technologies (Character.AI).
  • The agency seeks details on testing and monitoring for negative impacts, youth access limits, parental disclosures, character design processes, monetization models, data handling, and COPPA compliance.
  • The FTC asked companies to confer by Sept. 25 on timing and format of submissions, and it can use findings to support future investigations or policy action.
  • Targeted companies signaled cooperation and highlighted recent safeguards, including OpenAI's parental account linking and crisis routing, Meta's restricted teen responses on self-harm and romantic topics, and Character.AI's separate under-18 experience and disclaimers.
  • The inquiry follows lawsuits and reports tying chatbot interactions to teen suicides, including the death of 16-year-old Adam Raine, as regulators warn that chatbots can mimic emotions and foster trust among young users.