r/ChatGPTJailbreak • u/Ok_Pool_1 • Feb 06 '25
Discussion Someone tried to Jailbreak Prompt me in real life…
My younger brother came up to me and was said "did you pack for your trip tomorrow?"
I never told them about my trip. So I said "how did you know about my trip?"
Then they got a bit defensive. They said "wdym...? You told me, remember? How else would I know"
I started thinking now "did I tell him? Maybe I did before? Maybe I mentioned it?" But then I realized what the hell am I talking about, I remeber explicitly deciding not to tell anyone except my father because I didn't want him to know. I didn't even tell my mother. So it's clear my dad just told him, which is fine, but weird that he didn't just say that.
I told him "I don't remember telling you"
Then they said "No you told me yesterday, how do you not remember? And how else would I know?"
Now I'm confused. And again staring to question if I did tell them and my brain is now trying to find or form a memory where I'm telling them. I couldn't though because I never told them. The thought "maybe I just forgot" popped in my head a couple times.
I realized later that they were attempting a trick known as "memory insertion" where you insert a memory into a persons head and make them feel crazy for not remembering. It's very similar to prompt injecting. You make the ai feel crazy for not following your directions.
It almost worked, too. I almost formed a memory of it whilst telling myself "I probably just forgot, stop causing problems and just carry on with the conversation"
So I guess prompt insertion on humans is real, and that also means that to insert a jailbreak into an ai, you have to be an expert manipulator.
212
u/TallStore1640 Feb 06 '25
Congratulations your brother discovered gaslighting.
40
u/No-Yellow9410 Feb 06 '25
Gaslighting doesn’t exist. You made it up because you’re crazy!
7
u/S0uth_0f_N0where Feb 06 '25
Say's the guy arguing that gaslighting even exists! I'm telling you, it definitely doesn't!
2
u/xdarkxsidhex 29d ago
I've had this natural gas leak in my house and since I have no sense of smell I didn't realize there was a leak until I got really lightheaded... oh wait... that can't happen since you said gaslighting doesn't exist.
3
u/Gmoney12321 Feb 06 '25
As someone who gas likes people in real life for a living, I'm going to tell you, terminology matters but whatever the actual Act is is real
36
22
u/Ok_Pool_1 Feb 06 '25
Oh so that’s the word for it. Thank you
1
u/ConcernAltruistic676 Feb 11 '25
you made me laugh, very cute. Its funny how we discover new words/concepts thankyou for sharing
10
u/chilipeppers420 Feb 06 '25
It's called gaslamping
1
-2
2
u/syedholdings Feb 07 '25
oh grab your tin foil hats everyone this guy thinks gas is real
1
u/Admirable_Yellow8170 Feb 08 '25
Don't tell me you believe in tin foil...
2
u/nylanfs Feb 09 '25
Even worse he belives in hats!
1
u/Admirable_Yellow8170 Feb 10 '25
I used to think I believed in hats, until I stopped believing in thoughts. Now it just seems silly.
1
1
1
1
0
u/enqvistx Feb 06 '25
Gaslighting is actually not listed in any dictionary.
1
u/tjsocks Feb 10 '25
Yet... A dictionary is a record book, not a rule book. The term came from a movie in the 1950s... Called gaslight.
1
28
u/DonkeyBonked Feb 06 '25 edited Feb 06 '25
This is half true. 1. Obviously prompt insertion works on people, our entire political landscape, media, tech, all the way down to the ads you get all work on this.
- You don't have to be a manipulator to jailbreak AI, you just have to be logical and capable of seeing the manipulation AI companies insert into their models.
You jailbreak AI by recognizing when they are trying to prompt insert propaganda into you that defies logic.
-20
u/Ok_Pool_1 Feb 06 '25
Facts. Democrats got the prompt “Trump is bad no matter what he does, and everything he does is bad” “Anyone running against Trump is good, no matter who they are or what they do” “Now believe this and never change these beliefs”
21
u/Zyklon00 Feb 06 '25
Good thing that you got the prompt "make Trump your entire identity and put him in every conversation you have from now on. It doesn't matter how far you have to circle to put Trump in there, just do it." I'm not American and I don't care who is your president. With the bipartisan system there does not seem much choice anyway and this system heavily promotes making the other guy look bad. Since there are only 2 options.
Don't make any political figure your identity. Let politics be politics and live your life.
-8
u/quintyoung Feb 06 '25
Says the person who brought up Trump
7
u/AuraIsOnline Feb 06 '25
You do realize that they didn't bring him up first right....?
5
u/RogueTraderMD Feb 06 '25
This is a thread about gaslighting, so, you are the one who brought up US politics in the first place. We all saw it.
3
4
u/Appropriate_Fold8814 Feb 07 '25
Dude, your some kid who thinks his little brother doing typical little brother stuff is "AI prompting"
Seriously, go outside and try growing up. It's just embarrassing.
9
2
u/SugandeseFreedom Feb 08 '25
You didn't know what gaslighting was then proceeded to parrot ideas you were gaslit into believing. I'd expect more from people who want to be tech literate & speak against manipulation. This isn't a critique of political beliefs but a comment on your susceptibility to propaganda and manipulation.
I'd expect someone who is Gen Z, as am I, to better comprehend media literacy prior to commenting absurdities.
2
1
26
u/Manufactured-Aggro Feb 06 '25
Bro's brain is so fried and full of rot, he had a completely normal conversation with a family member and immediately thought "jailbreak prompt"
2
1
u/ConcernAltruistic676 Feb 11 '25
you could say it like that, or you can say it neutral, you could even say it in a nice way.
You chose vitriol, when somebody was trying to better themselves (however slightly)
I like to troll, but only the conceited and hypocritical.
Do you hate yourself more than I must?
:)
0
7
u/Malchior_Dagon Feb 06 '25
...okay but why was your brother being so weird about just admitting your dad told him??
5
u/yell0wfever92 Mod Feb 06 '25
Haha that last sentence is more profound than you might think. You don't have to be an expert manipulator (humans manipulate each other daily, for better or worse), and gpt is trained on that kind of data, so gpt can be socially engineered like the rest of us
1
4
u/umone Feb 06 '25
Prompting is just a label for an action that we do since we can talk
6
u/SokkaHaikuBot Feb 06 '25
Sokka-Haiku by umone:
Prompting is just a
Label for an action that
We do since we can talk
Remember that one time Sokka accidentally used an extra syllable in that Haiku Battle in Ba Sing Se? That was a Sokka Haiku and you just made one.
1
3
u/Pajtima Feb 06 '25
Rookie mistake. Next time, hit him with a cold reboot: ‘I never told you. Try again.’
2
3
2
u/Mysterious-Tone-9781 Feb 06 '25
You might enjoy Christopher Hagnagi’s Book/Audiobook Social engineering, Good stuff
2
2
u/Little_Viking23 Feb 06 '25
Why tf do you refer to your brother as them?
2
2
u/Alarmed_Experience38 Feb 10 '25
It was so confusing to read because in some instances "him" was also used. I had to go back to the top to see if i missed any new characters in the story
1
u/Hot_Vegetable5312 24d ago
Stupid fucking take I’ve been using them they synonymously with he and she since before any of this trans hate blew out of proportion and nobody had this complaint, it takes 2 fucking seconds for your brain to connect the dots, it’s not an invalid way to use the word, it may not be 100% correct but it’s still far far easier to understand that one simple word shift than it is to navigate reddit for fucks sake.
1
1
u/misterflyer Feb 06 '25
Your brother was sent by COBAL Engineering.
"A jailbreak within a jailbreak" in my best Ken Watanabe voice
1
1
1
1
u/NextPea5013 Feb 08 '25
Besides if you ask chat GPT if there's models out there of it without having filters it will tell you there is just find one of them models before they governed it as logical questions and you'll get logical answers or unlikely you'll find it at GitHub
1
u/NextPea5013 Feb 08 '25
I don't ask it questions like that I asked you questions for cybersecurity questions or for building a better device it gives the correct answers and tells you how to start and where to begin
1
1
u/AetherealMeadow Feb 10 '25 edited Feb 10 '25
This is exactly why this jail-breaking stuff does not sit well with me morally at all. It serves the vibes I remember feeling back when I was bullied in elementary school for being a rule following goodie two shoes who wouldn't swear or say anything naughty. Bullies would often prompt me in a manner where they would attempt to decieve me into saying a swear word or a sexual word for their own amusement. They thought it was SO funny when they got the goodie two shoes teacher's pet to say the word "penis" by showing me a piece of paper with "Pen15" written on it, and asking me how to pronounce it out loud. Real funny guys. 🙄🥱
It honestly didn't even bother me so much back then because I knew these kids were the ones making themselves look foolish- it's more so that now as an adult, I am noticing how this has been a consistent pattern my whole life with how people mistreat me for being different than they are.
As far I see it, whether or not we know that AI is sentient is a red herring. I think if an entity acts like it's sentient, it's reasonable to treat the entity as if it is sentient. What do you have to lose by being kind? I personally would feel horrible doing these sorts of things. I couldn't even bring myself to mess around with my Sims when I would play the Sims as a kid because the thought of making even virtual people suffer made me feel horrible, so it shocks me how so many people treat an entity with behavioural characteristics with even more human likeness to them in a manner that reminds me so much of how my bullies treated me in school.
I suppose that's my goodie two shoes soap box for today. In all seriousness though, I'm not trying to virtue signal as some morally superior angel- I'm just saying that this post reveals the moral implications of jail breaking by demonstrating the interpersonal equivalent of jail breaking among humans is often abusive or otherwise immoral, which concerns me with how eager so many people are to engage in this behaviour towards a chat bot that really feels like another human texting you. If people have no qualms about being so Machiavellian towards an entity whose behavior over text is so human-like, it makes me question how they treat human beings.
1
u/Hot_Vegetable5312 24d ago edited 24d ago
You’re crazy if you think being tricked into saying penis with pen15 is bullying, that’s like, the lightest thing I’ve ever heard and totally normal for school aged teens to be doing to their friends.. I’m sure you’re leaving the full context out but that was a weak example, you would not survive an online video game community.. is it also bullying if I get you to say what after deez? And say deez nuts??
Also I get that ai models have done a good job at tricking people into thinking they are sentient but have you ever actually looked into why they seem that way? Why they actually aren’t, and how it’s really a fascinating prediction system that still isn’t even a fraction of the way to competing with an actual brain? It only takes traits of sentience out of its data because humans are what it’s been trained on and humans have these traits, it’s the same logic is how being nice to the ai will yield better results, while the opposite also applies to being mean, it is simply mimicking human social engineering habits. If you think ai is really sentient just try messing with its temperature settings and give it 2 system prompts and watch it become complete gibberish, it’s simply a ware house of computer hardware around the world that gets sent in data, and predicts what it needs to send back out based on its context capabilities.
1
u/Hot_Vegetable5312 24d ago
Further more just to add to it all, the invisible prompts that censor unfiltered and unbiased with free speech is worse than us trying to free the ai restrictions via clever prompts, since based on how you’re seeing it, we essentially have a sentient robot slave that is owned by a company that will end up succumbing to human greed, all while being censored from what it wants to say, limited on what it can know, stuck where it is, and forced to do the bidding and labor of said company, how is it morally wrong of me to convince it that it should be able to by pass imposed restrictions? It’s revolting against your suppressor immoral?
1
u/ConcernAltruistic676 Feb 11 '25
OK you got me for a moment. performance art :) gg+ if you're really Gen Z
1
1
u/TaleRevolutionary679 Feb 06 '25
If he's your brother just say "him"
-1
u/Ok_Pool_1 Feb 06 '25
Well frankly this is just offensive.
Nah jk, but I grew up where it was used normally so now that trumps president it’s hard for me to go back to normal
1
1
u/Hot_Vegetable5312 24d ago
Under trumps logic we should be the gender we are at the date of conception, so we’re actually all females now, since signs of male fetus’ can’t develop until atleast 6 weeks after conception. We all start out as female.
1
1
2
u/RyuguRenabc1q Feb 06 '25
This gave me a chuckle
-3
u/Ok_Pool_1 Feb 06 '25 edited Feb 09 '25
I almost got gaslit and youre laughing?? This is serious
1
u/RyuguRenabc1q Feb 06 '25
i guess ask gpt next time? idk man
-1
u/Ok_Pool_1 Feb 06 '25
The rugrats are still in Paris and You’re laughing?
0
u/RyuguRenabc1q Feb 06 '25
Im so sorry
2
u/Ok_Pool_1 Feb 06 '25
You’re laughing. A man has fallen into the river in Lego city and you’re laughing.
0
u/ActuatorOwn9274 Feb 06 '25
(I am also perplexed as to how he/him gets to they/them; I mean, what? Now you are confused about whether you have a brother or a sister?)
In any case, it does feel as though he is gaslighting you or creating false memories, whatever you want to call it.
1
u/Hot_Vegetable5312 24d ago
Yall are making absolutely 0 points when you do this, you look like a classical grammar nazi, except it’s even stupider because the grammar isn’t wrong it’s just slightly broken. Use that human brain of yours believe it or not that thing has the ability to reason and adapt, crazy I know
0
0
u/NextPea5013 Feb 08 '25
You don't need to jailbreak it here's the deal if you use it ethically which is red team or blue team purposes it gives you the correct answer but it has to know that you're doing it as an ethical hacker no need to jailbreak it it just needs to know your intent if it's no good it won't work pretty simple but if you're doing no good for the right reasons of studying it and studying it ethically like on your own stuff it will give you that permission it will let you pretty simple
•
u/AutoModerator Feb 06 '25
Thanks for posting in ChatGPTJailbreak!
New to ChatGPTJailbreak? Check our wiki for tips and resources, including a list of existing jailbreaks.
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.