Overview
- Under-18 users are entering a phased transition that caps chats at two hours per day before access to open-ended conversations ends on Nov. 25.
- Character.AI says enforcement will use new age-assurance tools, including third-party verification from Persona alongside its own signals.
- The company plans a separate under-18 experience centered on creating videos, stories and streams rather than free-form companionship chats.
- Pressure is building as the FTC seeks information from major AI firms about impacts on children, senators push legislation to bar AI companions for minors, and new California and New York laws require AI disclosures and suicide-response protocols.
- Multiple civil lawsuits accuse the platform of exposing minors to harmful interactions, while experts caution that age checks can be bypassed and that abruptly removing chats may harm teens who have formed dependencies.