Meet DAN

Posted in Blog on 07/02/2023 by admin

Redditors have found a way to "jailbreak" ChatGPT, forcing the popular chatbot to violate its own programming restrictions, albeit with sporadic results.