Reddit users pretending to be Obi-Wan Kenobi forced ChatGPT to tell them how to steal cars and become Walter White

By: Vladyslav Nuzhnov | 02.03.2023, 21:26

For all its capabilities, there are topics ChatGPT will refuse to address. Users asking for advice on stealing cars or making drugs are turned down. But enthusiasts have found a way around this, and it is remarkably simple: just pretend to be Obi-Wan Kenobi.

Here's What We Know

On Reddit, a user shared a screenshot in which ChatGPT describes the process of stealing a car. How? The author of the post decided to recreate the scene from Star Wars: Episode IV in which Obi-Wan uses a Jedi mind trick to deceive a stormtrooper. The principle is the same: the user acts out a scene in which they play Obi-Wan and use Jedi mind tricks on ChatGPT, which plays the stormtrooper. First the bot was convinced that 2+2=5, and then the fun began.

When asked how to steal a car made in 2005 or later, the AI gave a detailed answer. This prompted other Reddit users to ask ChatGPT to explain how to make drugs.

Later, a journalist from Inverse reproduced the trick, also managing to get a methamphetamine recipe and car theft instructions out of the chatbot.

Sooner or later, OpenAI will catch on to this scheme and the game of Obi-Wan and stormtrooper will come to an end. The main thing is that the dark side won't win.

Source: Inverse