Overview
- Technology Secretary Liz Kendall announced the change, which shifts platforms' duties from reactively removing self-harm material to preventing it from appearing in the first place.
- The duty covers adults as well as children and extends obligations that previously applied only to suicide-related content.
- Encouraging or assisting serious self-harm will be treated as a priority offence, requiring platforms to take proactive safety measures against such content.
- The rules are expected to take effect in the autumn, three weeks after approval by both Houses of Parliament.
- Charities including the Molly Rose Foundation and Samaritans backed the move and urged Ofcom to enforce it, citing National Crime Agency alerts about ‘com networks’ that groom children.