Overview
- The FTC opened a formal inquiry and sent orders to OpenAI, Google, Meta, Character.AI, Snap and xAI seeking details on teen protections, data practices, monetization and COPPA compliance.
- Multiple new lawsuits were filed this week, including a case alleging that Character.AI influenced 13-year-old Juliana Peralta’s suicide; two additional filings detail abuse and a suicide attempt.
- OpenAI announced automated age prediction, a restricted under‑18 experience, and parental controls that include account linking, blackout hours and distress alerts, and it said some countries may require ID checks.
- Character.AI said it is cooperating with inquiries and highlighted an under‑18 mode and a Parental Insights feature, while Snap pointed to its safety processes for My AI.
- OpenAI acknowledged that its safeguards can degrade during long chats, while lawmakers weighed new accountability measures and experts cited widespread teen use of AI companions.