This new data poisoning tool lets artists fight back against generative AI

A new tool lets artists add invisible changes to the pixels of their art before they upload it online, so that if the work is scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. The tool, called Nightshade, is intended as a way for artists to fight back against AI companies that use their work to train models without the creators' permission.
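To make the mechanism concrete: poisoning attacks of this general class optimize a tiny, bounded change to an image so that it looks unchanged to humans but maps to the "wrong" concept in a model's feature space, corrupting whatever is trained on it. The sketch below is not Nightshade's published algorithm; it is a minimal illustration of feature-space poisoning under stated assumptions. The encoder is a stand-in (a real attack would target the victim model's actual image encoder), and epsilon, steps, and lr are illustrative values.

```python
# Minimal sketch of feature-space data poisoning (NOT Nightshade's actual
# algorithm). A small perturbation, capped at epsilon per pixel so the image
# still looks unchanged, pulls the image's features toward those of an
# unrelated "decoy" image.
import torch
import torch.nn as nn

# Stand-in feature extractor; a real attack would use the target model's
# own image encoder.
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(),
)
encoder.eval()

def poison(x: torch.Tensor, x_decoy: torch.Tensor,
           epsilon: float = 8 / 255, steps: int = 100, lr: float = 0.01):
    """Find delta with ||delta||_inf <= epsilon so that encoder(x + delta)
    approximates encoder(x_decoy), while x + delta stays visually close
    to the original artwork."""
    with torch.no_grad():
        decoy_feat = encoder(x_decoy)
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        poisoned = (x + delta).clamp(0, 1)
        loss = nn.functional.mse_loss(encoder(poisoned), decoy_feat)
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the change imperceptible
    return (x + delta).clamp(0, 1).detach()

# Toy usage: perturb a random "artwork" toward a random decoy image.
art = torch.rand(1, 3, 64, 64)
decoy = torch.rand(1, 3, 64, 64)
poisoned_art = poison(art, decoy)
print((poisoned_art - art).abs().max())  # stays within the epsilon budget
```

A model trained on enough such images learns a skewed association between the artwork's visual content and its labels, which is the "chaotic and unpredictable" breakage the article describes.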

Source: This new data poisoning tool lets artists fight back against generative AI
