Overview
- Ryde Police Area Command opened an investigation after several families reported to Eastwood Police Station that explicit AI‑altered images using the faces of female students were circulating online.
- The ABC reports that a male student who received one of the images alerted the school, and the NSW Department of Education says the school is working with police and will take strong disciplinary action if students are found responsible.
- eSafety Commissioner Julie Inman Grant confirmed her office is coordinating with NSW Police and the education department, and says deepfake image‑based abuse in Australia has doubled in the past 18 months, with at least one school incident each week.
- Researchers and officials say women and girls are overwhelmingly the targets, and the NSW government recently made creating AI‑generated intimate images of a person without their consent a crime punishable by up to three years in prison.
- Federal Attorney‑General Michelle Rowland says producing AI‑generated child sexual abuse material can carry penalties of up to 15 years' imprisonment, while regulators report high removal rates and are working with international partners to act against ‘nudify’ services.