‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned

May 31, 2024

A jailbreak of OpenAI’s GPT-4o used leetspeak to get ChatGPT to bypass its usual safety measures, allowing users to obtain instructions for hotwiring cars, synthesizing LSD, and other illicit activities.
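The core trick is character-level obfuscation: rewriting a restricted request in leetspeak so it slips past keyword-style safety checks while still being readable to the model. The short Python sketch below illustrates what such a substitution looks like; the mapping table and function name are illustrative assumptions, not the actual prompt or mapping used in the "Godmode" jailbreak.

    # Illustrative leetspeak substitution; the real jailbreak's mapping was not published.
    LEET_MAP = {
        "a": "4",
        "e": "3",
        "i": "1",
        "o": "0",
        "s": "5",
        "t": "7",
    }

    def to_leetspeak(text: str) -> str:
        """Replace common letters with look-alike digits; leave other characters unchanged."""
        return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

    if __name__ == "__main__":
        print(to_leetspeak("bypass the filter"))  # -> byp455 7h3 f1l73r

The transformed text no longer matches the original keywords, which is why this class of obfuscation can defeat naive string-based filtering even though the underlying request is unchanged.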
