University of Chicago Releases Nightshade, a Tool for Artists to Deter Unauthorized AI Use of Their Work
The tool makes subtle changes to images, invisible to the human eye, that can significantly alter how AI models interpret them.
- Nightshade, a tool developed by the University of Chicago, allows artists to 'poison' their images to deter AI companies from training models on them without permission.
- The tool makes subtle pixel-level changes to images that are invisible to the human eye but can drastically alter how an AI model interprets the image; a conceptual sketch of this kind of perturbation follows this list.
- Nightshade is designed to protect artists' intellectual property and discourage AI model trainers from disregarding copyright notices and do-not-scrape directives.
- The tool has been released for Windows PCs and Apple Silicon Macs, and is available for artists to download and apply to their artwork.
- Nightshade is the second tool from the University of Chicago team, following Glaze, a program that alters digital artwork to prevent AI models from mimicking an artist's style.
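
To make the perturbation idea concrete, here is a minimal, hypothetical sketch of how imperceptible pixel changes can shift a model's internal representation of an image. This is not Nightshade's actual algorithm: the toy linear feature extractor, the epsilon budget, and the single signed-gradient step are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of the general idea behind perturbation-based "poisoning":
# nudge pixels within a tiny budget (invisible to humans) in the direction
# that most changes a model's internal representation of the image.
# NOT Nightshade's real method -- the extractor, epsilon, and single-step
# update below are illustrative assumptions only.

rng = np.random.default_rng(0)

def toy_feature_extractor(x, W):
    """Stand-in for an AI model's image encoder: a single linear map."""
    return W @ x

def perturb(image, W, target_features, epsilon=4 / 255):
    """One signed-gradient step (FGSM-style) pushing the image's features
    toward a mismatched target, capped at an imperceptible pixel budget."""
    x = image.flatten()
    # Gradient of ||Wx - t||^2 with respect to x is 2 W^T (Wx - t).
    grad = 2 * W.T @ (toy_feature_extractor(x, W) - target_features)
    # Move against the gradient, but never more than epsilon per pixel.
    delta = np.clip(-epsilon * np.sign(grad), -epsilon, epsilon)
    return np.clip(x + delta, 0.0, 1.0).reshape(image.shape)

image = rng.random((8, 8))          # a tiny 8x8 grayscale "artwork"
W = rng.standard_normal((16, 64))   # toy encoder weights
target = rng.standard_normal(16)    # features of an unrelated concept

poisoned = perturb(image, W, target)
print("max pixel change:", np.abs(poisoned - image).max())  # <= epsilon
```

The property the sketch illustrates is the asymmetry between budgets: a per-pixel change capped at epsilon is invisible to a viewer, yet the accumulated shift across thousands of pixels can move the image's features toward an entirely different concept in the model's eyes.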