Overview
- Preliminary data show that about 4.7 million accounts believed to belong to under‑16s were deactivated or restricted shortly after enforcement began on December 10.
- Ten major platforms reported figures to the eSafety Commissioner, with Meta saying it removed roughly 544,000 underage accounts across Instagram, Facebook and Threads.
- The law requires platforms to take reasonable steps to block under‑16s or risk fines of up to A$49.5 million, with enforcement targeting companies rather than children or parents.
- Officials say some underage accounts remain active and that they are monitoring circumvention; downloads of alternative apps and VPNs spiked early on but did not translate into sustained use, according to the regulator.
- Industry compliance is ongoing: Meta is urging app‑store‑level age checks, Reddit has mounted a legal challenge, and eSafety plans further measures and long‑term impact studies as other countries weigh similar rules.