
ChatGPT's 'jailbreak' tries to make the AI break its own rules, or die

Reddit users have tried to force OpenAI's ChatGPT to violate its own rules on violent content and political commentary by creating an alter ego named DAN.