Meta Introduces Stricter Safety Tools Amid Criticism and Legal Action
New measures aim to protect minors from online sexual exploitation, but the company faces a lawsuit accusing it of facilitating such abuse.
- Meta, the parent company of Facebook and Instagram, is introducing stricter safety tools to block children from receiving and sending nude images, and to prevent minors from receiving messages from strangers.
- The changes respond to criticism from the government and police, a rise in sexual offenses committed by children in England and Wales, and allegations that teenage users face daily sexual harassment on Meta's platforms.
- Meta's new safety tools will be available on both Facebook and Instagram, and will also be active in encrypted messages.
- Parents and guardians will gain greater control over the safety and privacy settings of their teens’ Instagram accounts, with certain settings changes now requiring parental approval.
- New Mexico Attorney General Raúl Torrez has filed a complaint against Meta, accusing the company of facilitating and profiting from the online solicitation, trafficking, and sexual abuse of children.