Overview
- New commentary in The Lancet Global Health and a dataset shared by researcher Arsenii Alenichev document more than 100 AI-generated images used in global health and NGO outreach, dubbed “poverty porn 2.0.”
- Listings on major stock marketplaces such as Adobe Stock and Freepik offer photorealistic depictions of poverty and conflict for licensing; platform executives describe their role as that of marketplace facilitators bound by legal compliance rather than editorial gatekeepers.
- The United Nations removed a video that blended real footage with AI-generated re-enactments of sexual violence after concerns over improper use of the technology, while Plan International says it has adopted guidance advising against using AI to depict individual children.
- Researchers and practitioners cite budget pressures, lower costs, and the absence of consent hurdles as key drivers pushing some organizations toward synthetic imagery instead of commissioned photography.
- Experts warn that AI images replicate racialized stereotypes and could contaminate future training data; prior research also indicates that disclosing visuals as synthetic can reduce donation intentions, prompting calls for transparency and for investment in local, consent-based photography.