Researchers Use AI to Identify Apps With Risk of Sexual Abuse and Extortion
- Computer scientists created an AI-powered website that assesses social apps for reports of harassment and abuse.
- The website found that 1 in 5 social apps had complaints of child sexual abuse material.
- Apple and Google face criticism for distributing risky apps despite reports of abuse.
- Researchers believe Apple and Google should do more to inform parents and to police apps that enable abuse.
- The new tool aims to identify apps with patterns of sexual extortion and abuse.