Particle.news

Felca Reports Death Threats After Roblox Tightens Voice Chat With ID Verification

Officials frame the stricter age checks as a necessary step to reduce predatory risks on the platform.

Overview

  • Roblox’s policy update now requires identity verification to enable microphones and limits conversations to similar age groups, with guardian approval for children under 9.
  • The company says the system aims to prevent users under 16 from communicating with adults and notes that facial verification images are deleted after review.
  • Thousands of young players staged in‑game demonstrations using written signs and avatar performances, generating viral videos across X and TikTok.
  • Felca published screenshots of hostile messages, including explicit death threats, from users who falsely blame him for the platform’s decision.
  • Authorities and experts, including Louisiana Attorney General Liz Murrill, publicly supported the changes, citing a need for stronger parental controls and age checks.