Overview
- Preliminary data provided to the eSafety Commissioner show platforms deactivated, removed or restricted about 4.7 million under‑16 accounts within days of the Dec. 10 rollout.
- Meta reported taking down roughly 544,000 suspected underage accounts across Instagram, Facebook and Threads, while regulators caution that some under‑16 accounts remain live.
- Reports detail workarounds by teens, including fake ages, manipulated facial scans and use of parents’ details, with brief migration to apps like Lemon8 and Yope before tighter age limits were applied.
- Platforms face penalties of up to A$49.5 million for failing to take reasonable steps, must offer age‑assurance options beyond formal ID, and remain under close scrutiny from eSafety, which is assessing compliance platform by platform.
- Reddit says it is complying but is suing to overturn the ban, and the government vows to defend the law; a multi‑year study will track mental‑health impacts, while education efforts have drawn more than one million visits to eSafety’s site.