Instagram's Algorithm Recommends Sexual Content to Teens, Report Finds
A seven-month investigation reveals Instagram's failure to prevent explicit content from reaching users as young as 13.
- The Wall Street Journal and Northeastern University conducted tests simulating 13-year-old user accounts.
- Within minutes of being created, the accounts were shown sexually explicit videos, including offers of nude images.
- Meta says the tests don't reflect real-world usage and points to recent improvements in its content controls.
- Similar tests on TikTok and Snapchat did not yield the same results.
- Meta faces multiple lawsuits over its handling of underage user safety on its platforms.