Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use
(venturebeat.com)
Apparently people who specialize in AI/ML have a very hard time replicating the desired results when training models on 'poisoned' data. Is that true?
Only until they come up with some preprocessing step, better feature extractors, etc. This is an arms race, like many others.
The thing is, data poisoning is an arms race that the AI side will win with ease. You can solve it with either preprocessing or filtering. All the poisoning really does is make the images look worse. I can't think of a way to poison data that doesn't take more effort to unpoison than to poison.
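To illustrate the preprocessing argument: poisoning schemes like Nightshade embed small, high-frequency perturbations, and even a crude low-pass filter attenuates exactly that kind of signal. Here's a minimal, hypothetical sketch in pure Python (a 3x3 box blur on a toy grayscale image; a real pipeline would use JPEG re-encoding or a proper denoiser):

```python
def box_blur(img):
    """Return a 3x3 box-blurred copy of a 2D grayscale image (list of lists)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            # Average over the 3x3 neighborhood, clipped at the borders.
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

# A flat 5x5 image with one "poisoned" high-frequency spike:
clean = [[100.0] * 5 for _ in range(5)]
poisoned = [row[:] for row in clean]
poisoned[2][2] += 40.0  # small adversarial bump

blurred = box_blur(poisoned)
# After blurring, the largest deviation from the clean background
# drops from 40 to 40/9 (about 4.4): the perturbation is mostly gone.
print(max(abs(blurred[y][x] - 100.0) for y in range(5) for x in range(5)))
```

This is of course a toy: real poisons are crafted to survive simple transforms, which is exactly why it becomes an arms race between perturbation design and purification steps.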