Overview
- Discord has launched an experimental age-verification system in the UK and Australia to restrict access to sensitive content flagged by its platform.
- Users must verify their age either through an on-device facial scan or by uploading a government-issued ID, with the data deleted immediately after verification.
- The system is triggered when users attempt to interact with flagged content or modify sensitive content filter settings.
- Underage users are automatically banned but can appeal through a dedicated process if they believe the determination was incorrect.
- The trial responds to new UK and Australian laws requiring stricter age checks, with potential for global expansion depending on its outcome.