AI’s deadly (hopefully) Nightshade.

There’s something ironic about an article on an AI-art poison using AI art itself, but one step at a time. You have to give the poison time to work:

Nightshade, a new, free downloadable tool created by computer science researchers at the University of Chicago and designed to let artists disrupt AI models that scrape and train on their artworks without consent, has received 250,000 downloads in the first five days of its release.
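For the technically curious, the rough shape of the technique is data poisoning: tiny, bounded pixel changes that drag an image’s machine-visible features toward an unrelated concept, so a human sees the original art while a scraper’s model learns the wrong association. The sketch below is emphatically not the released tool’s algorithm, just a minimal illustration of that idea; `toy_encoder`, `poison`, and the `eps`/`steps`/`lr` values are all hypothetical stand-ins.

```python
# Minimal sketch of the general "data poisoning" idea behind tools
# like Nightshade. NOT the actual released algorithm: toy_encoder is
# a hypothetical stand-in for a real pretrained image feature
# extractor, and eps/steps/lr are illustrative values.
import torch

def toy_encoder(x: torch.Tensor) -> torch.Tensor:
    # Stand-in for a pretrained image encoder: maps pixels to features.
    return torch.tanh(x.flatten(1) @ torch.ones(x[0].numel(), 8))

def poison(image: torch.Tensor, target: torch.Tensor,
           eps: float = 8 / 255, steps: int = 200, lr: float = 1e-2) -> torch.Tensor:
    """Nudge `image` so its features resemble those of `target` (an
    unrelated concept), keeping the pixel change within an
    imperceptibility budget `eps`."""
    target_feat = toy_encoder(target).detach()
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        # Push the perturbed image's features toward the decoy concept.
        loss = torch.nn.functional.mse_loss(toy_encoder(image + delta), target_feat)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the perturbation invisible to humans
    return (image + delta).clamp(0, 1).detach()

# Illustrative usage: random tensors standing in for real images.
artwork = torch.rand(1, 3, 32, 32)  # the piece being protected
decoy = torch.rand(1, 3, 32, 32)    # an image of an unrelated concept
shaded = poison(artwork, decoy)
```

The real tool’s target selection and perceptual constraints are considerably more sophisticated than this; the point is only the shape of the trick: small pixel changes, large feature changes.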

This won’t do much to help writers defend against these companies scraping their words, since Nightshade targets image models, but it’s a positive step anyway. I understand that the line between what we’ve already accepted as valid artistic tools and what we’re currently declining to accept can be blurry; I also understand that more ground will be taken. But the basic issue remains: a bunch of companies data-scraped a lot of copyrighted material for their automated text/image generation without first paying for it, and the people who own that material are understandably upset that it’s being used without permission or payment*. If the courts won’t bring them relief, they will turn to other methods to prevent intellectual property theft.

Welcome (maybe) to a new Age of DRM, in other words. If you’re too young to remember the last one, don’t worry: you’ll hate it.

Moe Lane

*It’s also being used, at least implicitly, with the ultimate goal of replacing some of those people, but that’s a whole different issue.