Tool preventing AI mimicry cracked; artists wonder what’s next

Designed to help prevent style mimicry and even poison AI models to discourage data scraping conducted without an artist’s consent or compensation, The Glaze Project’s tools are now in higher demand than ever. But just as Glaze’s user base is spiking, a bigger priority for the Glaze Project has emerged: protecting users from attacks that disable Glaze’s protections—including attack methods exposed in June by online security researchers in Zurich, Switzerland.

Source: Tool preventing AI mimicry cracked; artists wonder what’s next
