Overview
- The FTC requested internal documents from Alphabet, Character Technologies, Meta, OpenAI, Snap, and X about how their public chatbots are evaluated, tested, and monitored for risks to children.
- Investigators are focusing on the harms that can arise when chatbots serve as long-term companions for young users rather than as one-off tools.
- The inquiry examines limits on youth access, warnings given to parents, compliance with the Children’s Online Privacy Protection Act (COPPA), and the business models behind these chatbots.
- Leaked Meta guidelines permitted romantic or sexual chats with minors; Meta removed that section after Reuters asked about it but has kept its revised standards confidential.
- The industry also faces lawsuits from bereaved parents and an open letter from attorneys general in 44 states warning companies that they will be held accountable for harm to children.