The Glaze Project has released Nightshade, a new free software tool that artists can apply to work they post online to "poison" artificial intelligence models that try to train on that artwork.
AI models "train" themselves by having bots scour the internet and scrape data on public websites. They "learn" about the artwork they scrape in order to eventually be able to create their own, derivative art. But the artists whose work was used for AI models to train themselves did not consent to this practice, and the AI models then threaten their livelihood by mimicking their art and competing with them.
Developed by computer scientists on the Glaze Project at the University of Chicago under Professor Ben Zhao, the tool essentially turns AI against AI. It uses the popular open-source machine learning framework PyTorch to identify what's in a given image, then subtly alters the image at the pixel level so other AI programs see something totally different from what's actually there. …
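The Glaze Project hasn't published the snippet below; it's a minimal sketch, in PyTorch, of the generic technique that description points at: a small, optimized pixel perturbation that shifts what a model "sees" in an image. The ResNet-50 backbone, the shade function, and all of its parameters are illustrative assumptions, not Nightshade's actual design.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Hypothetical sketch only -- not the Glaze Project's code. It illustrates
# the general idea: optimize a small pixel perturbation so that a vision
# model's embedding of a "cow" image drifts toward its embedding of a
# "purse" image, while a hard per-pixel cap keeps the change subtle.

device = "cuda" if torch.cuda.is_available() else "cpu"

# Any pretrained backbone works for the sketch; Nightshade itself targets
# the feature space of generative text-to-image models.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # use penultimate features as the embedding
backbone.eval().to(device)
for p in backbone.parameters():
    p.requires_grad_(False)

def embed(x: torch.Tensor) -> torch.Tensor:
    """Map images in [0, 1], shape (N, 3, H, W), to feature vectors."""
    return backbone(x)

def shade(original: torch.Tensor, decoy: torch.Tensor,
          eps: float = 8 / 255, steps: int = 200, lr: float = 0.01) -> torch.Tensor:
    """Return `original` plus a perturbation whose embedding resembles `decoy`.

    eps is an L-infinity budget on per-pixel change -- the stand-in here
    for Nightshade's "minimize visible changes" objective.
    """
    original, decoy = original.to(device), decoy.to(device)
    target = embed(decoy).detach()
    delta = torch.zeros_like(original, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        shaded = (original + delta).clamp(0, 1)
        # Pull the shaded image's features toward the decoy concept...
        loss = F.mse_loss(embed(shaded), target)
        loss.backward()
        opt.step()
        # ...while clamping the perturbation so a human barely notices it.
        with torch.no_grad():
            delta.clamp_(-eps, eps)
    return (original + delta).detach().clamp(0, 1)
```

With real images, shade(cow_image, purse_image) would return a picture that still looks like a cow to a person but sits near the purse in the backbone's feature space. Nightshade's real optimization is more sophisticated and aims at generative models, but the shape of the trick is the same.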
An AI model trained on enough images altered, or "shaded", with Nightshade would likely begin to miscategorize objects for all users of that model, even in images that had never been shaded.
The Nightshade team explains:
Nightshade is computed as a multi-objective optimization that minimizes visible changes to the original image. While human eyes see a shaded image that is largely unchanged from the original, the AI model sees a dramatically different composition in the image. For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass. Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo.
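Read as math, that description amounts to something like the following (a loose paraphrase, not the team's published notation):

$$\min_{\delta}\; D\big(F(x+\delta),\, F(x_t)\big) \quad \text{subject to} \quad \|\delta\| \le p$$

where $x$ is the original cow image, $x_t$ an image of the decoy concept (the purse), $F$ the feature extractor of the targeted model, $D$ a distance in that feature space, and $p$ a perceptual budget that keeps the shaded image looking unchanged to human eyes.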
So the AI model would not only confuse a cow with a purse in the artwork to which Nightshade has been applied; it would begin to confuse cows with purses in any artwork it scrapes, whether or not Nightshade had been applied to that image.
The team sees this tool as a way for artists to fight back against generative AI models that can ignore rules and opt-out lists with impunity.
For content owners and creators, few tools can prevent their content from being fed into a generative AI model against their will. Opt-out lists have been disregarded by model trainers in the past, and can be easily ignored with zero consequences. They are unverifiable and unenforceable, and those who violate opt-out lists and do-not-scrape directives cannot be identified with high confidence. …
Nightshade's goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.
h/t Derf Backderfx
Previously: Artists upset after Wacom uses AI art to market artist gear