Overview
- The Children’s Commissioner for England, Dame Rachel de Souza, has urged the UK government to immediately ban AI-powered ‘nudifying apps’ that enable the creation of explicit deepfake images.
- Teenage girls report fearing deepfake pornography as much as walking home alone at night, with many self-censoring their social media presence to avoid being targeted.
- While creating or distributing AI-generated child sexual abuse material (CSAM) is illegal under UK law, the apps used to produce such material remain lawful and readily accessible on major platforms.
- A Children’s Commissioner report highlights that 99% of explicit deepfakes online depict women and girls, underscoring the gendered nature of this abuse.
- Dame Rachel has called for new AI legislation to close these legal loopholes, classify deepfake abuse as a form of sexual violence, and hold tech companies and their executives accountable for enabling harm.