
Schools Face Surge of AI Sexual Deepfakes as Louisiana Charges Test New Laws

New prosecutions follow a dramatic rise in reports, with experts warning many schools lack clear protocols.

A school bus carries children at the end of a school day at Sixth Ward Middle School in Thibodaux, La., on Dec. 11, 2025. (AP Photo/Stephen Smith)
Baton Rouge, La., attorney Morgyn Young, who plans to file a lawsuit in federal court, talks about her client, Joseph "Tucker" Daniels, at Daniels' home in Thibodaux, La., Thursday, Nov. 13, 2025. (AP Photo/Matthew Hinton)
Joseph "Tucker" Daniels listens to lawyers at his home in Thibodaux, La., Thursday, Nov. 13, 2025, after speaking about his 13-year-old daughter being bullied with AI-generated deepfake pornographic images created of her by a boy classmate at Sixth Ward Middle School in August. (AP Photo/Matthew Hinton)
Joseph "Tucker" Daniels listens to lawyers at his home in Thibodaux, La., Thursday, Nov. 13, 2025. (AP Photo/Matthew Hinton)

Overview

  • In Thibodaux, Louisiana, two boys were charged under a new state statute after AI-generated nude images spread at a middle school. A 13-year-old victim, initially expelled over a bus altercation, was later allowed to return on probation, and deputies declined to charge her.
  • The Louisiana case is described by the bill’s author as the first brought under the state’s deepfake law, reflecting early use of 2025 statutes.
  • In 2025, at least half of U.S. states enacted laws targeting generative-AI abuses, including measures addressing simulated child sexual abuse material, and schools have seen prosecutions and expulsions in multiple states.
  • Reports of AI-generated child sexual abuse images to the National Center for Missing and Exploited Children jumped from 4,700 in 2023 to 440,000 in the first half of 2025.
  • Researchers and advocates say easy-to-use apps have removed technical barriers, and they urge schools and parents to update policies, educate students, and follow step-by-step response frameworks such as SHIELD to report content, preserve evidence, and support victims facing recurring trauma.