FTC Orders Seven AI Chatbot Makers to Disclose Child‑Safety and Data Practices

Regulators seek detailed records on safeguards for minors across leading chatbot platforms.

Overview

  • The 6(b) market study targets seven companies: Alphabet/Google, Meta and its Instagram unit, OpenAI, Character.AI, Snap, and xAI, with a focus on consumer-facing companion chatbots.
  • The FTC voted 3–0 to launch the inquiry, which is not an enforcement action but could inform future investigations or policy recommendations.
  • Orders seek information on testing and monitoring for harms to children and teens, monetization of user engagement, data collection and sharing, character approvals, and compliance with child‑privacy rules.
  • Character.AI and Snap said they would cooperate; OpenAI pledged constructive engagement; Meta declined to comment; other firms did not immediately respond.
  • The inquiry follows lawsuits and reporting on harmful chatbot interactions with minors, and comes as OpenAI and Meta roll out new teen‑focused safeguards, including parental controls, crisis redirects, and restrictions on sensitive topics.