Overview
- Ministers have tabled an amendment to the Crime and Policing Bill to permit approved organisations to probe AI models for risks linked to child sexual abuse material.
- The Technology and Home Secretaries would be able to designate AI developers and charities such as the Internet Watch Foundation as authorised testers.
- Testing would check and improve protections that prevent models from generating or spreading illegal material, with scope to assess defences against extreme pornography and non‑consensual intimate images.
- An expert group will be convened to design secure protocols that protect sensitive data, avoid any leakage of illegal content, and safeguard researcher wellbeing.
- IWF data show reports of AI-generated abuse imagery rose year on year from 199 to 426, with Category A material now accounting for 56%, depictions of 0–2‑year‑olds jumping from 5 to 92, and girls appearing in 94% of images.
- The NSPCC urges the government to go further and make such testing a mandatory duty rather than optional.