Tweet of the Day, NaNoWriMo Continues To Spiral Inward edition.

My relationship with NaNoWriMo has always been at arm’s length. Every November, I sit down and try to get out at least fifty thousand words, because it’s great for getting me that crucial first draft. But I never got into that entire world. Good instincts, I suppose.


Tweet of the Day, GIGOGOGOG@#$%!@#$!@ edition.

I’m surprised anybody’s surprised. I mean, isn’t this intuitive? AI prompt results don’t have conscious decisions behind them; the model just makes a high-probability guess, based on the existing information in its training data. Dump enough AI-generated output back into that training data, and the amount of actual information goes down. Eventually it collapses, and you get a smear of fuzzy junk.

This seems pretty straightforward, yes?
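You can watch the same thing happen in miniature. This is a toy sketch, not how any real model trains: the vocabulary size, sample count, and generation count below are made-up numbers, and "training" here is just resampling from the current data. But it shows the mechanism: any token that fails to get sampled in one generation is gone from every generation after it, so the pool of distinct information can only shrink.

```python
import random

random.seed(0)

VOCAB = 100      # distinct "tokens" in the original, genuine data
SAMPLES = 200    # size of each generation's training corpus

# Generation 0: real data, drawn uniformly from the whole vocabulary.
data = [random.randrange(VOCAB) for _ in range(SAMPLES)]

def support(corpus):
    """Number of distinct tokens still present in the corpus."""
    return len(set(corpus))

history = [support(data)]
for generation in range(20):
    # "Train" on the current corpus (here, just its empirical
    # distribution) and generate the next corpus by sampling from it.
    # A token that isn't sampled this round can never come back.
    data = random.choices(data, k=SAMPLES)
    history.append(support(data))

# The count of surviving distinct tokens, generation by generation;
# it never goes up, and it steadily bleeds out.
print(history)
```

Each pass loses a little of the original variety and none of it is ever recovered, which is the smear-of-fuzzy-junk endpoint in slow motion.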

Via @Strangeland_Elf.

Tweet of the Day, They Sent Lawyers, Guns, And Money edition.

And, indeed, the ordure is about to hit the winnowing blade.

Via @KateDrawsComics, and more here. Note, however, that this case involves musical catalogues, and that Sony, Universal, and Warner are no doubt willing to negotiate a payoff (and licensing fees). Does that make this a pointless lawsuit? …Not really, because it’ll establish that ‘fair use’ does not extend this far. And that will help immensely with all the other anti-AI lawsuits out there.

Tweet of the Day, When You Should NOT Seek Forgiveness Instead Of Permission edition.

To wit: when the person you’re trying to impose on can afford a metric [expletive deleted]ton of vicious attack lawyers. (Via @IMAO_)

One hopes that this is not the end of the matter.

In case it ever comes up: do not use AI to write legal briefs.

It will cause you to double lose your cases – which is to say, you’ll first lose on the merits, and then you’ll get fined by the court for being a natural-born damned fool.


AI’s deadly (hopefully) Nightshade.

There’s something ironic about the fact that this article about an AI-art poison uses AI art, but one step at a time. You have to give the poison time to work:

Nightshade, a new, free downloadable tool created by computer science researchers at the University of Chicago which was designed to be used by artists to disrupt AI models scraping and training on their artworks without consent, has received 250,000 downloads in the first five days of its release.
