Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

But even without filing lawsuits, artists have a chance to fight back against AI with technology of their own. MIT Technology Review got an exclusive look at Nightshade, a new open source tool still in development that artists can apply to their images before uploading them to the web. It alters pixels in a way that is invisible to the human eye but "poisons" the art for any AI model that tries to train on it.
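To give a rough sense of what "altering pixels invisibly" means in practice, here is a minimal Python sketch that applies a small, bounded perturbation to an image. To be clear, this is not Nightshade's actual method: real poisoning tools optimize the perturbation against a model's feature extractor so that the image is mislabeled during training, whereas this sketch only illustrates the imperceptibility side, using random noise. The file names and the epsilon value are placeholders.

```python
# Illustrative sketch only -- NOT Nightshade's algorithm. This shows the
# general idea of an imperceptible, bounded per-pixel change; a real
# poisoning tool would optimize the perturbation against a target model
# rather than drawing it at random.
import numpy as np
from PIL import Image

def perturb(path_in: str, path_out: str, epsilon: int = 2) -> None:
    """Add a small bounded perturbation (|delta| <= epsilon per channel).

    epsilon=2 out of 255 is far below what most viewers can notice.
    """
    # Load as int16 so the addition below cannot overflow uint8.
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Random perturbation in [-epsilon, +epsilon] for every channel value.
    delta = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    # Clamp back into the valid 0..255 range and restore uint8.
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out, format="PNG")

perturb("artwork.png", "artwork_shaded.png")  # hypothetical file names
```

One practical detail worth noting: the output is saved as PNG because a lossy format like JPEG would smooth away exactly the kind of low-amplitude pixel changes a tool like this depends on.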

Excellent. This is exactly the kind of clever thinking we need to stop major corporations from stealing everyone’s creative works for their own gain. I hope these poisons can be developed further, to the point of rendering these “AI” tools entirely useless.

Get permission, or get poisoned.
