Study Finds TikTok and Instagram Recommend Self-Harm Content to Teen Profiles

UK regulators say enforcement under new child‑safety codes is now under way.

Overview

  • The Molly Rose Foundation reported that 97% of Instagram Reels and 96% of TikTok recommendations shown to its teen test accounts were judged harmful.
  • Researchers reviewed 300 Instagram Reels and 242 TikTok videos collected between November 2024 and March 2025 using dummy 15-year-old profiles that had interacted with such material.
  • On TikTok, 55% of the harmful recommendations referenced suicide or self-harm ideation and 16% referenced methods, with one in ten harmful videos receiving at least 1 million likes.
  • TikTok and Meta disputed the findings, citing teen-account protections and stating that over 99% of violative content is proactively removed.
  • Ofcom and Technology Secretary Peter Kyle said new codes now require services to curb harmful recommender systems, with 45 sites under investigation; a separate Children's Commissioner report highlighted young people's rising exposure to violent pornography online.