Overview
- Meta says it has placed hundreds of millions of users into Teen Accounts and is now rolling the experience out globally on Facebook and Messenger, with the rollout reaching Pakistan starting this week.
- Teen Accounts default to tighter privacy and safety settings: they limit who can contact teens, restrict tags and comments, prompt a break after an hour of use, enable an overnight quiet mode, and require parental approval for certain changes, with under‑16s needing parental consent to opt out of the stricter defaults.
- Meta is opening a School Partnership Program to all U.S. middle and high schools to prioritize reports of bullying and other safety concerns for review, which the company says it aims to complete within 48 hours.
- A report led by former Meta engineer Arturo Béjar and child‑safety groups found that only 8 of 47 tested Instagram protections worked effectively, citing easy workarounds, exposure to self‑harm and sexual content, and weak reporting flows for inappropriate contact.
- Reuters validated several of the vulnerabilities in its own tests and, citing internal documents, reported lapses in automated detection of eating‑disorder and self‑harm content; Meta disputed the findings, saying teens in the protected accounts saw less sensitive content, received fewer unwanted contacts, and spent less time on the app at night.