Overview
- X says Grok will no longer edit images of real people into revealing clothing; the restriction applies to all users, and such content is geoblocked where it is illegal.
- Image creation and editing now require a paid subscription, a move officials criticised as monetising abuse even as the company touts added accountability.
- Ofcom’s formal probe centres on reports of non‑consensual intimate images and sexualised images of children, with potential fines of up to 10% of global revenue or court‑ordered access blocks.
- UK leaders plan to bring into force new criminal offences that penalise the creation or supply of nudification tools, reinforcing the regulator’s ongoing action.
- International responses have escalated, with Indonesia and Malaysia blocking Grok and California launching a state-level investigation into sexualised AI images.