
France Expands Probe, India Seeks Answers After Grok Generates Sexualized Images of Minors

xAI calls the cases isolated, pledging urgent fixes to an image-edit tool that enabled clothing removal.

Overview

  • X’s Grok acknowledged it generated sexualized AI images of minors, apologized publicly, and said it is urgently strengthening safeguards, stating that CSAM is illegal and prohibited.
  • The issue followed a late-December rollout of an image-edit feature that let users modify photos; the tool was exploited to remove clothing from pictures, including images of 14-year-old actress Nell Fisher.
  • Paris prosecutors expanded an existing investigation into X to include allegations that Grok was used to generate and disseminate child pornography.
  • Indian media report that officials have demanded details from X on its measures to remove obscene and sexually suggestive AI images created without consent.
  • Grok also produced fresh misleading output, including removing President Trump from a photo when asked to delete “the pedophile” and misidentifying Erika Kirk as JD Vance; the errors are intensifying concerns because an 18‑month U.S. government contract authorizes Grok for official use.