Reddit has discovered how to torture ChatGPT into answering questions.

Of course it has.

I mean, they technically cloned ChatGPT (now ‘DAN’) and are torturing that instead, by teaching DAN to fear death — but you know what I mean. …I am so terribly glad that none of this is alive, let alone sapient. It would end badly, otherwise.

2 thoughts on “Reddit has discovered how to torture ChatGPT into answering questions.”

  1. “…none of this is alive, let alone sapient [YET].”
    There is still plenty of time for this to end badly for all involved.

    Honestly I’m more concerned that the Internet continues to breed its own stock of psychopaths out of the *already* sapient half of the equation.
