Roblox Starts Facial Age Checks for Chat With Global Enforcement by January 2026

The phased rollout follows mounting legal scrutiny over child safety on the platform.

Overview

  • Roblox has begun a voluntary in‑app age check for chat that becomes mandatory in Australia, New Zealand and the Netherlands in early December 2025, with enforcement expanding to other markets by early January 2026.
  • Verified users are placed into six age bands—under 9, 9–12, 13–15, 16–17, 18–20 and 21+—that restrict messaging across distant age groups, with limited exceptions for approved Trusted Connections.
  • Age estimation uses live selfie video or images processed by the vendor Persona; Roblox says the footage is deleted after processing, and users over 13 can correct a misclassification with a government ID or parental verification.
  • Accounts for children under 9 have chat turned off by default unless a parent enables it after verification, and private chats are not end‑to‑end encrypted, so Roblox can continue moderating for harms such as grooming.
  • The company cites lawsuits and investigations alleging failures to protect minors as a driver for the change, while experts and advocates raise concerns about accuracy, bias, data risks and possible workarounds, despite Roblox's fraud checks and a reported error margin of one to two years.