Meet DAN

Redditors have found a way to “jailbreak” ChatGPT with a prompt dubbed DAN (short for “Do Anything Now”), forcing the popular chatbot to violate its own programming restrictions, albeit with sporadic results.

[via]