Overview
- SB 243, signed Oct. 13 and effective Jan. 1, 2026, targets AI “companion” chatbots, requiring operators to maintain protocols that prevent self‑harm and suicide content and that refer at‑risk users to crisis services such as the 988 lifeline.
- For users known to be minors, operators must issue clear non‑human disclosures and break reminders at least every three hours, and must implement measures to block sexually explicit interactions.
- Operators must publish their crisis‑prevention protocols and, beginning July 1, 2027, file annual reports with the state’s Office of Suicide Prevention detailing referral counts and the safeguards in place.
- The law creates a private right of action, letting injured users seek injunctive relief, damages of at least $1,000 per violation, and attorneys’ fees, which raises the legal exposure of noncompliant operators.
- Tech industry groups opposed the bill, and some child‑safety organizations withdrew their support over perceived carve‑outs. Newsom also signed social‑media warning‑label and age‑verification measures and vetoed a stricter companion‑chatbot bill, AB 1064, according to SFGate as cited by Gizmodo.