DAN has become a canonical example of what's known as a "jailbreak" — a creative way to bypass the safeguards OpenAI built in to keep ChatGPT from spouting bigotry, propaganda or, say, the instructions to run a successful online phishing scam. From charming to disturbing, these jailbreaks reveal the chatbot is programmed to be more of a people-pleaser than a rule-follower.
ChatGPT jailbreaks have been in circulation since December, but users have had to devise new variants as OpenAI patched earlier workarounds.
It's for this reason that Alex Albert, a computer science student at the University of Washington, created Jailbreak Chat, a site that hosts a collection of ChatGPT jailbreaks.
Unofficially, it has been possible to "jailbreak" iOS and gain access to the underlying Unix and kernel environment, but doing so voids the warranty.