Particle.news

Charity Condemns Meta's Reduced Safety Measures on Harmful Content

The Molly Rose Foundation urges stricter regulation after Meta scales back content moderation policies, citing risks to children's safety.

  • Meta announced changes to its content moderation policies, reducing automated scanning for certain harmful posts and relying more on user reports.
  • The Molly Rose Foundation, established after Molly Russell's 2017 death linked to harmful social media content, criticized the move as regressive and dangerous for children.
  • The foundation called on UK regulator Ofcom to strengthen the Online Safety Act, ensuring proactive scanning for all harmful content and safeguarding children from exposure.
  • Meta stated it continues to prohibit and remove content promoting suicide, self-harm, and eating disorders, maintaining its community standards and safety measures for teen accounts.
  • Ofcom emphasized its commitment to holding tech companies accountable under the Online Safety Act, with plans to consult on enhanced automated moderation requirements.