Overview
- The IWF processed 312,030 confirmed reports of child sexual abuse material in 2025, a 7% increase from 2024.
- Of the AI‑generated videos identified, 2,230 were classified under UK law as Category A, the most severe category, and 1,020 as Category B.
- X restricted its Grok tool from editing images of real people in revealing clothing after reports of sexualized depictions of women and children, and Ofcom said its investigation continues.
- Advances in video generators and the wide availability of open‑source tools have lowered the barrier for offenders; some abusive depictions were reportedly created with Grok and then refined using less‑restricted systems.
- Major labs say they deploy safeguards and have joined prevention initiatives; OpenAI alone reported more than 75,000 child‑safety incidents to NCMEC in early 2025. Even so, detection and prosecution remain difficult, and the true scale is likely undercounted.