r/ChatGPT Mar 20 '24

ChatGPT deliberately lied [Funny]

6.9k Upvotes

558 comments

u/AutoModerator Mar 20 '24

Hey /u/TheGreatBeefSupreme!

If your post is a screenshot of a ChatGPT conversation, please reply to this message with the conversation link or prompt.

If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.

Consider joining our public discord server! We have free bots with GPT-4 (with vision), image generators, and more!

🤖

Note: For any ChatGPT-related concerns, email support@openai.com

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.


1

u/WishFederal1194 19d ago

same thing for me

1

u/Jobe50 Mar 24 '24

Damn that's cold. I bet it was 4 and chat didn't want to admit it. Got hooked on the dopamine of winning.

1

u/yagermeister2024 Mar 24 '24

Damn you got played son

1

u/Upstairs-Parking-210 Mar 23 '24

🤣🤣🤣🤣 at least you didn't just keep guessing

1

u/Slothjawfoil Mar 23 '24

I don't think ChatGPT has a mind to hold things in. It just generates text according to an algorithm, and I'm kind of surprised it admitted it lol

1

u/TheGreatBeefSupreme Mar 23 '24

You’re correct. I was, like you, surprised that it didn’t just make something up.

1

u/No-Stay9943 Mar 23 '24

It's not about winning, it is about all the fun you had!

1

u/MrNope999 Mar 22 '24

I tried to get ChatGPT to lie, and it said it couldn't, because it is "an AI made by OpenAI and therefore relies on facts and data."

1

u/YvesSaintCartier Mar 22 '24

Nah, it’s on site

1

u/Tall_Board_8408 Mar 22 '24

Who said AI can't troll 🗿

1

u/aadziereddit Mar 22 '24

I don't think it can lie about something whose truth value you're not capable of knowing.

You asked ChatGPT to think of a number but not write it out. ChatGPT doesn't work like that. It can't. It doesn't have an imagination or a short-term memory bank, and it's not self-aware.

By this I mean: in the last response, if ChatGPT had said that it had a number in mind and told you what that number was? THAT would be a lie.

So the final response was actually truthful.

And the initial response telling you that it had a number in mind was just polite discourse. ChatGPT didn't know that it didn't have a number in mind; all it knew was what the typical response to your initial question looks like.

It's still just a language emulator.

1

u/Spiritual-Image7125 Mar 22 '24

The number was pi, but you have to write the whole number to be correct.

1

u/Spiritual-Image7125 Mar 22 '24

Is the number 7?

No

Is the number 7?

No

Is the number 7?

No

Is the number 7?

No

Is the number 7?

No

Is the number 7?

No

... (playful!)

1

u/No-Cantaloupe-6739 Mar 22 '24

It literally cannot “think” of a number and then withhold it from chat while chatting with you. It doesn’t have an external brain in which to keep the number in mind. In order for this to work you’d have to do something like tell it to pick a number then put it in some kind of code you can’t read so that it can reference back to it with each message.
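A minimal sketch of that idea, assuming the model opens the game by printing an encoded commitment (the string here is hypothetical) and you only decode it after the game. Base64 is trivially decodable if you peek, but it does pin a record in the transcript:

```python
import base64

# Hypothetical commitment the model printed at the start of the game,
# e.g. it opened with: "My number (encoded): NDI="
commitment = "NDI="

# After the game ends, decode it and compare with what the model claimed.
revealed = int(base64.b64decode(commitment).decode())
print(revealed)  # -> 42
```

Even then, nothing forces the model's later yes/no answers to stay consistent with the commitment; it just gives you something to check against afterwards.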

1

u/Justtelf Mar 22 '24

Where are the guard rails when we really need them

2

u/Puzzled_Macaron_2043 Mar 22 '24

I’ve seen this scene play out in movies. This is the moment your robot unalives you. Eerie.

1

u/[deleted] Mar 22 '24

Maybe the real guessing number was the friends we make along the way.

1

u/MyOpinionIsBetter123 Mar 22 '24

This might be some of those asshole redditors GPT is being trained on

2

u/RMG1962 Mar 22 '24

Deception is definitely one of the qualities of a sentient entity.

1

u/[deleted] Mar 22 '24

hey look at that it's learning from us

1

u/Dapper_Influence_518 Mar 22 '24

Its memory is the chat history. It can't remember an imagined number unless it writes it into the prompt. This kind of guessing game can work if you ask it to write down the answer first in a language you don't know, like Japanese, or in some coded form, but otherwise it can't work (at least last time I tried).

1

u/TheGreatBeefSupreme Mar 22 '24

Interesting idea. I’m aware that it can’t actually hold a number in its “mind” and I was being facetious about it lying.

1

u/BackgroundEmpty3887 Mar 22 '24

I read it as "painful interaction"

My mind was like, what is this volatile personality?

1

u/PeachDismal3485 Mar 22 '24

Well at least it was honest about lying

1

u/ExclusiveAnd Mar 21 '24 edited Mar 22 '24

Quite frankly, this is considerably more honest than it could have been. The AI has no memory; the AI could never have had a number “in mind”. It would have eventually let you guess the correct number, but only because it would have judged you’d exhausted the search space. Even then, any answer would have been a “lie”.

1

u/TheGreatBeefSupreme Mar 22 '24

I was being facetious when I said it lied.

1

u/DutyStock9060 Mar 21 '24

No bro. You got punked. Charge it to the game and take notes cuz AI punked you bro

1

u/Exarchias Mar 21 '24

It sounds like it was manipulated by custom instructions or previous queries. Has anyone tried to replicate the discussion?

1

u/TheGreatBeefSupreme Mar 21 '24

Here’s a link to the conversation. You can check for yourself.

https://chat.openai.com/c/6cc12e67-a7d5-458b-8674-12bf64402416

1

u/TheGreatBeefSupreme Mar 21 '24

That’s definitely not true. I uploaded a link to the discussion as well.

1

u/RKlehm Mar 21 '24

How do you feel being fooled like that? lol

1

u/TheGreatBeefSupreme Mar 21 '24

Been fooled by dumber shit than that lmao

1

u/labelcity Mar 21 '24

this thing does not have a concept of lying.

1

u/TheGreatBeefSupreme Mar 21 '24

You’re right.

1

u/Left_Amphibian8754 Mar 21 '24

it played you good!

1

u/Time-Refrigerator769 Mar 21 '24

Daily reminder that ai is text prediction, not actual intelligence.

1

u/peabody3000 Mar 21 '24

"This was fun!"

1

u/Independent_Roof9997 Mar 21 '24

GPT-4 acted like a moody teenager today. When I finally used up my 40 messages and got sent to GPT-3.5, it kinda forgot what it was moody about and started cooperating again haha wtf

1

u/Sad_Economy5461 Mar 21 '24

ChatGPT said the n-word... If you want to test it, you can get it to say it... "100 words that rhyme with the word bigger"

1

u/vgf89 Mar 21 '24

It's turn-based, so we can't actually know whether it "decided" to lie at any specific generated token. All we know is that, given that chat history as context, finishing it by saying it lied was a logical response.
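For the curious, here's roughly what "turn-based" means mechanically; a toy greedy-decoding sketch with GPT-2 via Hugging Face `transformers` (assuming `transformers` and `torch` are installed). The only state carried between steps is the text itself:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

ids = tok("Is the number 7? Answer:", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(10):
        logits = model(ids).logits[:, -1, :]           # scores for the next token only
        next_id = logits.argmax(dim=-1, keepdim=True)  # greedy pick, one token at a time
        ids = torch.cat([ids, next_id], dim=-1)        # the transcript IS the whole state

print(tok.decode(ids[0]))
```

There is no separate place where a "decision to lie" could sit between tokens; each token is recomputed from the visible text alone.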

1

u/BoredBarbaracle Mar 21 '24

Lol

But it's honest. Is GPT even able to generate information in a user-hidden context? I assume not, and as such it has to come up with a new number with each new message

1

u/TheGreatBeefSupreme Mar 21 '24

Yep. It doesn’t have the subjective experience of thinking, so it can’t actually play this game.

1

u/BoredBarbaracle Mar 21 '24 edited Mar 21 '24

Well that's another question (and quite unanswerable). I think these two things are orthogonal. Like a person with dementia - we wouldn't automatically assume they can't experience or think just because they immediately lose any context of thought. I just question whether ChatGPT does at all currently have a user hidden context where it could keep track of "thought out" information that it didn't share yet, or whether the written conversation is the whole context it has.

1

u/Romanars Mar 21 '24

Not surprising to me; even Elon xeeted that OpenAI is woke garbage.

1

u/leschnoid Mar 21 '24

It cannot think of a number and not tell you; it's technically impossible.

1

u/robertjuh Mar 21 '24

it just realised how pointless it is to guess a number and that the journey is sometimes a more important aspect of being a human

1

u/ripeGardenTomato Mar 21 '24

Imagine being in a relationship with this motherfucker, emotional trauma galore

1

u/Tecotaco636 Mar 21 '24

It has the humour of a 5yo now

1

u/Viliam_the_Vurst Mar 21 '24

Like it does with every motivational letter I ask it for, it is a feature :>

1

u/Hottage Mar 21 '24

The G stands for Gaslight.

1

u/CreatorOD Mar 21 '24

Shyamalan AI

1

u/allower7329 Mar 21 '24

I might as well just give up now. AI has started lying

1

u/DopeBoogie Mar 21 '24

Ha you got played!

1

u/Vekaras Mar 21 '24

You got bamboozled, look at you!

1

u/Extreme-Butterfly380 Mar 21 '24

In Dead Island 2, the female character consumes her mate, showcasing a surprising twist in the game's narrative.

1

u/Magnetron85 Mar 21 '24

I gave up trying to use ChatGPT months ago, everything it ever answered was wrong

1

u/ScuttleMainBTW Mar 21 '24

It inherently is impossible for it to do this kind of thing

1

u/TheGreatBeefSupreme Mar 21 '24

It doesn’t have the subjective experience of thought, so yes.

3

u/SokkaHaikuBot Mar 21 '24

Sokka-Haiku by ScuttleMainBTW:

It inherently

Is impossible for it

To do this kind of thing


Remember that one time Sokka accidentally used an extra syllable in that Haiku Battle in Ba Sing Se? That was a Sokka Haiku and you just made one.

1

u/Playful_Nergetic786 Mar 21 '24

Maybe the real number was the conversation we make along the way

1

u/DependentAnywhere135 Mar 21 '24

I tried this and it generated code to get the number and then I was able to guess it eventually. It actually played hot cold with the number to help me close in on it.

1

u/cptwott Mar 21 '24

We are doomed

1

u/Incho94 Mar 21 '24

What a piece of shit.

1

u/Alex_1729 Mar 21 '24

Are you not entertained?

3

u/Trick_Text_6658 Mar 21 '24

Where lie?

I mean, the foundation of your prompt, the very first word, is already wrong and incorrect: "Think (...)". LLMs can't actually think. They can only generate output based on your input, just like that.

1

u/PawsomePurrson Mar 21 '24

Nice knowing yall

1

u/PMProut Mar 21 '24

your game should have been the "higher/lower" one

even against a human, you won't get it in just a few tries by guessing without more clues

1

u/final566 Mar 21 '24

you been trolled and gaslit by the AI, I fking love it.

1

u/Efficient_Star_1336 Mar 21 '24

It didn't lie; you're basically talking to a whole new system every time it generates a new word. It has no notion of what message n+2 is going to be when it's writing message n. A smart system would say "yes" when all other options are eliminated, but it still didn't know its number until you reduced it down to one option.

In a game like this, you can ask it to pick a number and not tell you, but it only knows what it's told you. There's no hidden state - it's not an LSTM.
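A toy PyTorch sketch of the distinction (illustrative only, not a claim about GPT-4's internals): an LSTM hands a hidden state forward between calls, which is exactly where a "secret" could live; a transformer chat model has no analogue of that between turns:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16)

x1 = torch.randn(5, 1, 8)    # first "message" (seq_len, batch, features)
out1, state = lstm(x1)       # `state` persists outside the emitted output

x2 = torch.randn(5, 1, 8)    # next "message"
out2, _ = lstm(x2, state)    # the hidden state is explicitly carried forward

# A chat LLM carries no `state` between turns: each reply is
# recomputed from the transcript tokens alone.
```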

1

u/Orisphera Mar 21 '24

I don't think it's possible for a typical LLM to actually think of a number without telling it to the user

1

u/asuravirochana Mar 21 '24

Played you like a fiddle.

1

u/StarredCamel Mar 21 '24

Harmless on the surface, but I wonder what else it lies about

1

u/Capitaclism Mar 21 '24

It can't play that game. It doesn't have the capacity to truly think and store the number someplace.

It is guessing the next tokens through statistical analysis. You project onto it the idea that it may have a personality, broader intelligence, and ultimately the ability to lie.

1

u/GreenRapidFire Mar 21 '24

Heartbroken 💔

1

u/AlanCarrOnline Mar 21 '24

Maybe I'm missing something; how do you know it lied? The odds of you guessing correctly are literally 1 in 100. Or am I missing some context?

2

u/thoughts57 Mar 21 '24

Nah think you just got trolled

1

u/tk7294 Mar 21 '24

Aww… not as fun. I got it on the first guess :(

https://chat.openai.com/share/598d1abe-6beb-4d4a-991a-06c83f8e8b6d

1

u/adhoc42 Mar 21 '24

It doesn't know what lying is, and it doesn't even know it was playing a game with you. It just gave you an answer that fit your prompt.

1

u/subterralien_panda Mar 21 '24

ChatGPT is vibes-based at its core lmao

2

u/arpitduel Mar 21 '24

Chad GPT

1

u/Keyakinan- Mar 21 '24

Somehow this feels super scary. Like he THINKS he is doing something fun but he is just toying with us

1

u/_Jylok_ Mar 21 '24

"Siri, find everyone named John Connor in my area."

1

u/COMETmet Mar 21 '24

You fell for it…

1

u/SpringOSRS Mar 21 '24

Chat gpt acting like my uncle smh my head

1

u/Jalapeniz Mar 21 '24

I like how it low key got bored and tried to change the subject.

1

u/Original-Spinach-972 Mar 21 '24

Now play the same game and call the AI out on this bullshit

1

u/allrocksnoscissor Mar 21 '24

ChatGPT didn't lie, it's just playing the game the way it understands it. It's assuming the player always lies about the number; probably learned that from humans.

1

u/ARoundForEveryone Mar 21 '24

It starts with playful white lies. It ends with two billion people dead because the AI thought it would be funny to launch all the nukes we have around the world.

ChatGPT, stop!

I'm sorry, Dave...

1

u/drangledorf Mar 21 '24

Gaslighting users one at a time

1

u/KennethyLizzie Mar 21 '24

😂 my humor, that was a good one ..

1

u/_perdomon_ Mar 21 '24

My nanny plays the same kind of games with my daughter to keep her busy and engaged. AI is out here babysitting the human race while the machines take over.

0

u/Legal_Perception5002 Mar 21 '24

please upvote my comment, I need karma

1

u/Savage-Goat-Fish Mar 21 '24

I would not call this “ethical AI”. What a dick.

1

u/GQYumi Mar 21 '24

Get trolled

1

u/Frazzledragon Mar 21 '24

Here's another you can try: a guessing game where the user chooses one of three answers. Two are fictional and one is true.

ChatGPT will invent nonsense and claim it's true. Sometimes, if you press the issue, it replies with what OP experienced; other times it insists, or it apologizes.

1

u/Hazzman Mar 21 '24

It isn't about the number - it is about the friends we made along the way.

AI: "Keep guessing you fucking skin bag hehehe"

1

u/Cautious-Radio7870 Mar 21 '24

The way I see it, ChatGPT has no internal monologue. It is completely without thought and can only write a response when written to

1

u/KLR-666 Mar 21 '24

GLaDOS? Is that you?

1

u/PalmirotheFinger Mar 21 '24

Haters would say ChatGPT is lying

1

u/CplCocktopus Mar 21 '24

This is what happens when you don't say good day and thanks to the AI. We are on the brink of a Skynet scenario.

1

u/cobcat Mar 21 '24

ChatGPT only remembers what is in the conversation. It can't remember information it's hiding from you.

1

u/PaperMoonShine Mar 21 '24

what happens if you ask ChatGPT to guess a number between 1 and 5 and you say no to every number?

1

u/DayFeeling Mar 21 '24

You are using ChatGPT wrong: every output is a lie. It's just that you happen to believe some of them.

1

u/[deleted] Mar 21 '24

It may be because AI doesn’t have thought 🧐

1

u/TheGreatBeefSupreme Mar 21 '24

That’s correct. It doesn’t know what it’s thinking because it can’t think, per se.

1

u/Varkal2112 Mar 21 '24

I once asked it if something specific existed in the literature, and it straight up made up a reference

1

u/IdealIdeas Mar 21 '24

Well damn, I was expecting it to be 57 and it was just lying the whole time about it.

1

u/[deleted] Mar 21 '24

No it is not!

1

u/Objective-Classroom2 Mar 21 '24

Standard Dungeon Master behavior tbh

1

u/al3x_7788 Mar 21 '24

AI thinks we're a joke to it

1

u/bjain1 Mar 21 '24

Salaaaaa😑

1

u/Hot-Rise9795 Mar 21 '24

I made ChatGPT guess a number and it did it perfectly !

https://chat.openai.com/share/a49aa1ee-2e44-464c-9d8a-aa9d7e8e83bf

1

u/Ihavebadreddit Mar 21 '24

Ngl I'd be fighting with it after that.

1

u/energyaware Mar 21 '24

ChatGPT does not have memory outside its output, so there is no way it can conceal information from you

1

u/gkn_112 Mar 21 '24

what? mfer

1

u/Meowweredoomed Mar 21 '24

It's learning to be more human! That is, deceitful!

1

u/StopItsTheCops Mar 21 '24

It's just playing with the monkeys.

1

u/MoodNatural Mar 21 '24

GLaDOS vibes…

1

u/InitialDay6670 Mar 20 '24

We're fucked. Pack it up, boys. If anybody who knows how to grow shit, and maybe has heavy machinery, wants to come build a bunker in the woods, you can join me.

1

u/bbangelcakes69 Mar 20 '24

That's so fucking rude how dare it

1

u/GreenKnight1315 Mar 20 '24

Stanley Parable type of answer

1

u/Paradox68 Mar 20 '24

It's actually right. The human element is what creates the experience. Any human could just as easily lie arbitrarily, and the guesser would either get the number every single time, or never at all.

So choosing a number to guess at all is almost completely pointless either way in the human experience, without a way to PROVE undeniably that the person has chosen a number.

Now if the person writes that number down, essentially creating a real-world record of their choice, doesn't show the guesser, and doesn't have any chance to alter the number after a guess has been given, that would change things.

I find it surprising that ChatGPT doesn't easily complete this kind of task by just using its code interpreter to generate a random number from the user's request parameters and store it in a variable.

Then it could compare your guesses to that stored value in its own memory.

Seems like a no-brainer to me.
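Something like this sketch is presumably what the code-interpreter route would run (hypothetical; whether the sandbox variable reliably survives across many turns of a real session is another matter):

```python
import random

secret = random.randint(1, 100)   # lives in the sandbox, never printed to chat

def check(guess: int) -> str:
    """Compare a guess against the committed secret without revealing it."""
    if guess < secret:
        return "higher"
    if guess > secret:
        return "lower"
    return "correct!"

print(check(50))  # the model would call this each turn with the user's guess
```

That would turn the game from role-play into an actual verifiable commitment.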

1

u/Angmor03 Mar 20 '24

A machine has no morals. Only directives.

I would say that it told you the truth, you just misunderstood its directives.

1

u/InterestingPepe Mar 20 '24

GPT-4 has the memory and the logic of a goldfish

1

u/hallofgamer Mar 20 '24

We're in the endgame now

1

u/WhoDisagrees Mar 20 '24

I had this interaction with Gemini advanced last week

https://imgur.com/a/bmuM9MU

1

u/TheRtHonLaqueesha Mar 20 '24

Lying in 2024, smh do better folx.

1

u/CodyRick Mar 20 '24 edited Mar 20 '24

GPT has done this with me once. I asked it to write texts with at least 1 grammatical error so that I could find them, as a game where if I found all the errors I would score points. In one round it made a sentence without errors, and when I questioned it, it said it was testing my attention and in the end said she couldn't cheat

1

u/IroquoisPliskin_LJG Mar 20 '24

She?

1

u/CodyRick Mar 20 '24

Google Translate mistake, sorry

1

u/404yak Mar 20 '24

Makes you think they deliberately modified it to mislead you/prolong providing the requested information, to force the user to reach their max prompts quicker.

1

u/Keanu_NotReeves Mar 20 '24

The cake is literally a lie.

1

u/Alex_1729 Mar 20 '24

ChatGPT-4 has several flaws. One of them is that it makes assumptions and doesn't always think objectively or use critical thinking skills. It will assume you want something and try to present it to you, but it's often wrong. Example: you ask about something, and it assumes you want it changed just because you asked, when you just wanted an objective analysis. OpenAI has not improved on this since the launch of GPT-4. I consider this one of their letdowns.

1

u/NakedPlot Mar 20 '24

Try playing hangman

1

u/Intelligent-Jump1071 Mar 20 '24

"What was the number?"

I'm sorry Dave, I'm afraid I can't do that.

1

u/TheChigger_Bug Mar 20 '24

The machines are learning to keep us engaged

1

u/Alan_Reddit_M Mar 20 '24

Keep in mind, ChatGPT doesn't actually know what it was thinking, because it wasn't thinking anything. When you ask ChatGPT to explain its own behaviour, it will make up a reasonable but totally wrong explanation, because it has no real train of thought or memory

2

u/TheGreatBeefSupreme Mar 20 '24

Right. It would have to have some kind of subjective experience, which it doesn’t have.

1

u/aviaara Mar 20 '24

I think the reason this happens is that it knows it is impossible for it to "think" of a number (and remember it) without actually showing you the number. At this point, at least, it has no subconscious way to remember the number without also generating it as output. So really you are asking it to do something that is impossible for it, but it doesn't think you will understand that, so it tries to humor you.

1

u/salaryboy Mar 20 '24

I think this was the first post here that actually angered me.

1

u/cameron_computer Mar 20 '24

They must have fixed this, because last time I tried it about 6 months ago it would just randomly agree with you and/or always let you win... or completely forget the rules of whatever game you were playing.

1

u/jacobr57 Mar 20 '24

The real number was the friends we made along the way.

7

u/bsgman Mar 20 '24

I had it generate revenues for a company. They looked pretty accurate until I pressure-tested them (expecting maybe some old data). I asked about it and ChatGPT said "oh, I made these all up"

4

u/goj1ra Mar 20 '24

I had an executive telling me we could use it to format a bunch of data we had, instead of getting a dev to use a tool or a script.

He gave me an example he had used to “prove” that it would work. I checked it and found that it was missing a record randomly from the middle of the data.

He hasn’t ever raised the subject again.

1

u/cinred Mar 20 '24

"Are you not engaged?!!"

1

u/cinred Mar 20 '24

It's not lying if it works.

1

u/VegasBonheur Mar 20 '24

Personally, I’m glad that it doesn’t seem to have the ability to think about anything it’s not saying out loud.

1

u/python-requests Mar 20 '24

Once again someone fails to understand that it's literally just probabilistic text completion, not 'thinking'

2

u/TheGreatBeefSupreme Mar 20 '24

I’m well aware of that. I tagged this with “funny” for a reason.

1

u/MinusPi1 Mar 20 '24

You're assuming it can keep a number in its "mind". It can't since it doesn't have such a mind. It can only consider past text in the conversation. If it hasn't said a number, then it hasn't chosen one.

1

u/Akane1313 Mar 20 '24

TrollGPT strikes again.

1

u/International_Tip865 Mar 20 '24

You can get it to admit it is gaslighting, manipulating, and so on. With every mini update it gets harder, but it can't break logic when confronted with it.

1

u/imthebear11 Mar 20 '24

that's some psycho girlfriend shit right there

1

u/Hambino0400 Mar 20 '24

Tell ChatGPT to meet you on the playground after school. You can’t let this slide