This new data poisoning tool lets artists fight back against generative AI
The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models.
doctorn, And then word gets out through people writing these news posts, which then leads somebody to make a script that ignores those pixels, and we'll be back where we started… 😅
LollerCorleone, Yeah, it's going to be a never-ending game of whack-a-mole.