Overview
- From July 25, “risky” sites and apps must deploy “highly effective” age verification—such as facial age estimation, credit-card checks, photo-ID matching or digital ID wallets—to keep under-18s away from pornography and other harmful content. Non-compliant services face fines of up to £18 million or 10% of global turnover, whichever is greater, and could be blocked in the UK.
- Ofcom’s children’s codes require platforms to filter out content that encourages self-harm, suicide or violence, offer straightforward reporting tools and submit risk assessments by early autumn.
- Major providers including Pornhub, Meta, TikTok, Reddit and X have committed to comply with the new rules or risk enforcement action under the Online Safety Act.
- Technology Secretary Peter Kyle has insisted that every user under 13 be barred from social media, and is set to formalise proposals for two-hour daily limits plus night-time and school-time curfews for under-16s.
- Campaigners led by crossbench peer Beeban Kidron are urging ministers to use the Act’s powers to “detoxify dopamine loops” by curbing addictive features on platforms used by children.