Particle.news
Major Social Media Platforms Fail to Address Suicide and Self-Harm Content

Study finds content moderation by Facebook, Instagram, Snapchat, and X inadequate, prompting calls for stronger regulation.

  • Research by the Molly Rose Foundation found that Pinterest and TikTok alone accounted for over 95% of the harmful content removed.
  • Meta's platforms, Facebook and Instagram, detected only 1% of suicide and self-harm content in the study.
  • Elon Musk's X and Snapchat flagged even less, with X responsible for just 0.14% of moderation decisions.
  • The Foundation calls for a strengthened Online Safety Act to protect children from preventable harm.
  • Parents and regulators demand more assertive action from tech companies to safeguard young users.