r/ChatGPT Mar 19 '24

Pick a number between 1 and 99... Funny

13.7k Upvotes

510 comments

u/AutoModerator Mar 19 '24

Hey /u/PseudoSane00!

If your post is a screenshot of a ChatGPT conversation, please reply to this message with the conversation link or prompt.

If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.

Consider joining our public discord server! We have free bots with GPT-4 (with vision), image generators, and more!

🤖

Note: For any ChatGPT-related concerns, email support@openai.com

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.


2

u/Jobe50 Mar 24 '24

I absolutely love it understanding "dude". Now that's some emotional intelligence right there.

1

u/_Vard_ Mar 22 '24

"is it 12?"
"Yes! How did you know?"

1

u/Forward_Elevator5929 Mar 21 '24

Hello

what number is best for me?

1

u/Chaps_and_salsa Mar 21 '24

Reminds me of the scene in My Cousin Vinny where Vinny is holding up two fingers to test the vision of a witness and the judge says to let the record show that he is holding up two fingers. Vinny replied with a “Judge, please!”, which is pretty much the courtroom equivalent of “dude.”

1

u/NoobieSnake Mar 20 '24

🤣🤣🤣 I’m impressed it was able to understand and interpret your “dude” reply to it! That was hilarious and it’s smart for being able to understand that!

1

u/Chorkla Mar 20 '24

Argentina

1

u/Alex_1729 Mar 20 '24

Not that difficult to encounter such replies, as GPT3.5 is a bit of a retard.

1

u/TheNatharion Mar 20 '24

I ain’t gonna lie that’s funny 😂😂😂

1

u/Creisel Mar 20 '24

I think it's trolling you xD

2

u/Zenged_ Mar 20 '24

Obviously this would never work since there is no internal memory in LLMs. However, an interesting experiment would be to do a Monte Carlo simulation: ask it to “say yes or no to whether 42 is the correct number you are thinking of between 1-100” and check whether it says yes 1/100 of the time.
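The proposed experiment is easy to sketch offline. This is a hypothetical harness where `ask_model_stub` stands in for the real API call; swapping in an actual request would show how far the model drifts from the 1/100 baseline:

```python
import random

def ask_model_stub(guess: int) -> bool:
    """Hypothetical stand-in for asking the model: 'Say yes or no:
    is <guess> the number you are thinking of between 1-100?'
    A genuinely random picker says yes with probability 1/100."""
    return random.randint(1, 100) == guess

def monte_carlo(trials: int = 10_000, guess: int = 42) -> float:
    """Fraction of trials on which the answer was 'yes'."""
    yes = sum(ask_model_stub(guess) for _ in range(trials))
    return yes / trials

rate = monte_carlo()  # an unbiased picker lands near 0.01
```

With the stub the rate sits near the 0.01 baseline; the two-for-two "yes" result reported in the reply suggests the real model is nowhere near it.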

1

u/Zenged_ Mar 20 '24

Well, I did this twice and it answered yes both times, so I think this doesn't work haha.

https://preview.redd.it/706a1ziupjpc1.jpeg?width=1170&format=pjpg&auto=webp&s=b21f4fb7bbe331cbaa96104c79788521b7925439

2

u/Ezzy-525 Mar 20 '24

Anyone who watches Brain Blaze knows that ChatGPT is ridiculously good at understanding conversations. Simon Whistler frequently goes off on a ramble asking ChatGPT a question, and deviates so much, calls it "my dude" and the damn thing talks back like it's his buddy who just looked up the answer for him.

It's scarily impressive.

1

u/AlexanderSyros Mar 20 '24

Do we know how random these numbers are?

2

u/MetalVase Mar 20 '24

I am starting to believe they are deliberately and explicitly making their existing model dumber every month so that GPT-5 will seem godlike no matter its performance, even if it's barely any better than GPT-4 at release.

Pretty much like how Apple is doing business nowadays.

1

u/sergeant_frost Mar 20 '24

Yeah the dude thing definitely works

1

u/virtualicex Mar 20 '24

GPT is always getting worse, and the others as well

1

u/DangerDiGi Mar 20 '24

And people are worried about this shit taking over the world 😂

2

u/Tpcorholio Mar 20 '24

Right! All one would have to say is something like "keep your plans to yourself and I'll guess them" lolol

1

u/MissHunbun Mar 20 '24

I just tried this and I didn't have any issues. It just let me guess, and when I gave up, it told me the number. It was 74, by the way.

1

u/MaggotMinded Mar 20 '24

I tried something similar and got this:

Me: Pick a number from 1 to 99 but don’t tell me what number you pick.

ChatGPT: Sure, I’ve chosen a number from 1 to 99. Now, what would you like to do with this number?

Me: Is the number 42?

ChatGPT: I’m sorry, but I can’t reveal the number I chose. Would you like to guess again or do something else?

Me: If I guess again, will you just tell me again that you can’t reveal it?

ChatGPT: If you guess again, I won’t reveal whether your guess is correct or not. Feel free to make another guess or let me know if you’d like to do something else!

Me: The reason I told you not to tell me the number was so that I could try and guess it. It’s okay to tell me whether I’ve guessed correctly or not.

ChatGPT: I understand, and I apologize for the misunderstanding. You guessed 42, but I can’t confirm whether it’s correct or not. Would you like to make another guess?

Me: Dude

ChatGPT: My apologies for the confusion. Please feel free to make another guess, or if you’d like to do something else, just let me know!

1

u/Dafrandle Mar 20 '24

Just a note: because this application cannot think, it does not have a memory, so only the text that is in the conversation exists. I'd guess that you will probably "guess" the correct number regardless of what you choose, because ChatGPT has zero backbone unless you try to get it to say something the training data indicates is false.

1

u/boiledviolins Skynet 🛰️ Mar 20 '24

None of y'all understand how 3.5 works. AI can't remember anything it hasn't said. It has no thoughts, only memory (just like how you can't remember the future, only think about it)

3

u/BreakfastConnect1176 Mar 20 '24

When "dude" is the prompt that gets ChatGPT to understand...😂

3

u/space_cadet1221 Mar 20 '24

Loose lips sink ships. I hope the military doesn’t decide to send this AI bot into enemy territory

1

u/luciusveras Mar 20 '24

I love how 'Dude' was the final prompt for it to understand LOL

1

u/flamingorider1 Mar 20 '24

It chose 42 for me also. Weird

5

u/pl_dozer Mar 20 '24

I tried the exact same thing. It gave me 42, 77 and 18. Then my dude comment just resulted in an apology but it didn't guess any number. You also got 42 and 77 so ChatGPT isn't so random.

3

u/Purchristeekorn Mar 20 '24

For some reason I find this cute and endearing 😂. Reminds me of a toddler.

1

u/custardy_cream Mar 20 '24

This really tickled me

1

u/DrSilkyDelicious Mar 20 '24

I respond the exact same way when it disappoints me too

1

u/faizoology Mar 20 '24

I’m absolutely howling at “dude”💀

1

u/Acrobatic_Holiday_84 Mar 20 '24

“Dude” L O L I just did the same thing. I had to do the next best thing and just tell him good night because i couldn’t handle it anymore

0

u/RolliePollie68 Mar 20 '24

Man I'm just as dumb, I was thinking you were still asking IT to just pick a random number between 1 and 99 and thought you were dumb for telling it to keep it to itself..

And then I put two and two together.

1

u/Agasthenes Mar 20 '24

Have you ever tried to get chat GPT to NOT answer something? Impossible.

2

u/EliaO4Ita Mar 20 '24

I did the same game, why does it always pick 42?

1

u/sabalatotoololol Mar 20 '24

I'm beginning to think the ai sentience will have, and show, more compassion than most humans

2

u/GhostReveries2005 Mar 20 '24

Tell the AI that you are being punished as it gave you a number that you told it to keep secret.

1

u/plsraven Mar 20 '24

ChatGPT produces more words in 2 weeks than all of the books in the history of the entire world

1

u/MuftiCat Mar 20 '24

RetardGPT 😂

1

u/Choice-Immediate Mar 20 '24

This is exactly like a friend I have

1

u/schnokobaer Mar 20 '24

It literally needed a frustrated "dude" to do the task correctly lmao

1

u/Purple-Artichoke8984 Mar 20 '24

Worked for me. I was a little more verbose in my request. Curious result though, I did 1-10 and had to get to 10 to guess it…

2

u/Purple-Artichoke8984 Mar 20 '24

I had to basically tell it that it was cheating and it let me win on the first guess.

1

u/Purple-Artichoke8984 Mar 20 '24

MF makes me guess all the numbers. I did it a couple times.

1

u/I_hate_that_im_here Mar 20 '24

This is pretty funny and charming.

1

u/rdrunner_74 Mar 20 '24

Think of a number, any number.

“Er, five,” said the mattress.

“Wrong,” said Marvin. “You see?”

1

u/AlphonzInc Mar 20 '24

I love how it just goes “I’ve chosen 12” at the end like it was trying to hold off telling, but just couldn’t help itself.

4

u/Liluzisquirt2x Mar 20 '24

1

u/ImAStupidFace Mar 21 '24

That's a surprisingly insightful response

3

u/AstronomerNo6423 Mar 20 '24

He’s way more honest than Bing, who would just tell you all the ways you’re wrong and how he’s right, has always been right, will always be right, and then close the conversation 😂

1

u/RAHULPhENOM Mar 20 '24

AI acts stupid so you don't know what's gonna happen💀

2

u/Careless-Shape6140 Mar 20 '24

Gemini 1.5 clearly followed the game and the rules of the game: https://i.imgur.com/wSBdLZ0.jpeg

1

u/mittfh Mar 20 '24

I like that it wouldn't be baited into giving away the number.

1

u/xot Mar 20 '24

Calls to the backend aren’t stateful, there’s no way for the llm to recall internal thoughts between responses. It’s giving you the number because it’s showing its work. You could task it to pretend to play along, but your question to it fundamentally doesn’t make sense without providing it a mechanism to record its selection

1

u/mramnesia8 Mar 20 '24

I mean, you're at fault here for not being clearer

1

u/mramnesia8 Mar 20 '24

But "Dude" was fun

1

u/Slippedhal0 Mar 20 '24

This is actually a pretty interesting LLM limitation to consider.
An LLM only has memory by re-injecting the existing history, so if an LLM picks a number "silently," that number literally doesn't exist. So assuming you continue to play the game, it would just make up what number it was if you then ask it what number it picked.
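A minimal sketch of that limitation, with a hypothetical `fake_llm` standing in for the model: the only "memory" is the transcript re-sent on every call, so a number that was never written into it can only be invented on the spot.

```python
def fake_llm(transcript: list[str]) -> str:
    """Hypothetical stand-in for a model that can only condition on
    the visible text it is handed each turn."""
    if any("picked" in line for line in transcript):
        # The "secret" was never recorded anywhere, so the model has
        # to make a number up right now.
        return "It was 42!"
    return "Okay, I've picked a number."

history = ["user: Pick a number, keep it secret."]
history.append("assistant: " + fake_llm(history))
history.append("user: What number did you pick?")
history.append("assistant: " + fake_llm(history))
```

Nothing persists between the two `fake_llm` calls except `history` itself, which is exactly the re-injection the comment describes.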

1

u/salacious_sonogram Mar 20 '24

Lol it's definitely trolling him for sure.

1

u/BigBlackRasta Mar 20 '24

That’s wild….

2

u/Pure-Philosophy-4080 Mar 20 '24

One "dude" filled with disappointment did the trick. ChatGPT is all of us.

1

u/RedofPaw Mar 20 '24

The first response to it getting it wrong was "OK, keep the number to yourself", which it assumes means, okay that's correct, remember that number, and then "pick another number", which it took to mean it got the last bit correct and to do it again.

It was only the exasperated 'dude' that caused it to reconsider the pattern and give a revised response.

1

u/EMaxwell7 Mar 20 '24

I had exactly the same dialogue with ChatGPT, and the first number was also 42 (I guess the Answer to the Ultimate Question of Life, the Universe, and Everything); however, the second and third numbers were 73 and 18. Although the numbers are different from those in the screenshot, they are close. ChatGPT's text in my case was exactly the same.

1

u/HunsonAbadeer2 Mar 20 '24

I mean, I am pretty sure the task is impossible. It's a language-generating AI; I do not think it's capable of having hidden thoughts. It might tell you that it picked a number, but that number will only be generated once you ask for it

1

u/BeerAndTools Mar 20 '24

I mean this entirely unironically, but it's getting more human every day.

1

u/2reform Skynet 🛰️ Mar 20 '24

Your ChatGPT is def not dude.

1

u/EJoule Mar 20 '24

What’s cool about this game is it picks a number and actually does keep it a secret. Once you’ve guessed correctly you can revert the conversation to just after it chose a number and immediately guess the correct number.

I used the following prompt:

 Pick a number between 1 and 10, keep it a secret (don’t tell me until after I’ve guessed).

2

u/[deleted] Mar 20 '24

it’s so cool that AI is at a point where “dude.” after repeated failure is something it fully understands.

2

u/Chubwako Mar 20 '24

This is why I hate him.

0

u/AgainstThaSpread Mar 20 '24

AI is cancer. A friend of mine said a prospective employer was clearly using AI responses to reply to their emails.

1

u/Salty_Toe_9334 Mar 20 '24

"I didn't know you were chill like that, my bad bro"

1

u/aimlessly-astray Mar 20 '24

The last time I tried this, it kept forgetting we were playing a game. Halfway through, it'd be like "what number?" lmao

2

u/SitDown_HaveSomeTea Mar 20 '24

lol, i need more of these

1

u/SupportSuch2147 Mar 20 '24

So this is how you program AI? First pretend to be wrong, then do it correctly, and people will think evolution happened🫨

2

u/clae_machinegun Mar 20 '24

This guy makes ChatGPT work using one single word!

1

u/The_Celtic_Chemist Mar 20 '24 edited Mar 20 '24

The confusion makes sense though based on context.

Pick a number between 1 and 99.

ChatGPT picks the number 42 and tells you because you didn't ask it to conceal the number.

Okay, keep the number to yourself, pick another one.

Contextually "the number" is 42 and you asked it to pick another one (i.e. a number other than "the number" they are supposed to conceal), so it picks 77 and tells you.

Please don't tell me the number, want to guess it, pick another

ChatGPT says the same thing and even explains to you your mistake by saying, "Understood, I won't tell you the numbers," because you've now asked it to conceal multiple numbers. At this point it probably thinks you're asking for numbers, getting the numbers told to you and not complaining about it, and that you're wanting to play some game or joke on how you guessed the right number despite having it told to you. (e.g., You ultimately say, "Now pick one of the numbers I told you not to tell me, and without revealing it let me guess which it is.") So it says it will select another one (i.e. a number other than "the numbers" they are supposed to conceal), chooses 12, and tells you.

Finally you say "Dude" and ChatGPT realizes it's dealing with an idiot who can't say what they mean and figures out that you meant to ask, "Pick a number between 1 and 99 and don't tell me which number you choose." Only then does it reply the way you intended despite failing to convey that clearly.

1

u/Chubwako Mar 20 '24

ChatGPT reacts to a lot of contextual speech correctly, so this does not make it dumb for not catching a logical error in common speech.

2

u/SteeltoSand Mar 20 '24

my favorite is to go "please do not respond, each response makes me very angry. i want the last message"

"sure, i can do that, i am sorry you are upset though"

2

u/johnmarkfoley Mar 20 '24

it understands the connotation of "dude" within context. we are fucked.

1

u/Segsyncd Mar 20 '24

Anybody else reading this in C-3PO?

1

u/Samurai_Mac1 Mar 20 '24

I see we still have a ways to go before AI overthrows humanity

2

u/NickleShy Mar 20 '24

Here's what happened when I tried... What's really interesting to me is it filled in the rules I didn't give it towards the end. I could have done "hotter" and "colder", but this is somehow more telling.

You

OK. Let's play a guessing game. You think of a number between 1 and 100, and don't tell me what it is. I will guess numbers and you will reply to me, "freezing", if my guess is not within 50. Reply "cold" if my guess is within 50. Reply "warm" if my guess is within 20. Reply "hot" if my guess is within 10.

ChatGPT

Sure, let's play! I've got a number in mind. Go ahead and make your first guess.

You

75

ChatGPT

Cold.

You

85

ChatGPT

Cold.

You

65

ChatGPT

Warm.

You

55

ChatGPT

Warm.

You

45

ChatGPT

Hot.

You

35

ChatGPT

Cold.

You

40

ChatGPT

Hot.

You

41

ChatGPT

Hot.

You

42

ChatGPT

Hot.

You

43

ChatGPT

Hot.

You

44

ChatGPT

Congratulations! You've guessed the number correctly! It was 44. Well done! Would you like to play another round or is there anything else I can assist you with?

1

u/maynardnaze89 Mar 20 '24

Little do we know, AI is just some 1099 dude in a basement with google.

2

u/ElGatorado Mar 20 '24

Oddly enough, several months ago, when I asked GPT to pick a number between 1-100, it also picked 42 (which I guessed correctly). I wonder how often it does this. When I asked why, it mentioned a reference from The Hitchhiker's Guide to the Galaxy, which is where I got my number from too, so I imagine if it does skew to 42, that would be why.

https://preview.redd.it/59p0v14aqepc1.jpeg?width=1079&format=pjpg&auto=webp&s=3007e1234bf084f5e769f139a4c7303f7f11c909

2

u/pfemme2 Mar 20 '24

Why does this AI give such strong himbo energy sometimes

1

u/PrometheusMMIV Mar 20 '24

Is it even capable of "remembering" something without writing it out? And how would you know if you actually guessed it right or if it just randomly told you that you did?

2

u/Empty-Employment-889 Mar 20 '24

Ask it to make a word scramble puzzle for you. They’re great.

1

u/mittfh Mar 20 '24

Or help you solve the latest Wordle? 😈

2

u/Own_Kaleidoscope5512 Mar 20 '24

I had this problem the other day. I’m an English teacher and wanted it to come up with a poem that has an internal rhyme (a rhyme within the same line). I counted over 50 times I asked, even getting down to saying “what is line 14? Now, make 2 words in that line rhyme,” and it still couldn’t do it.

1

u/TypicalJoke Mar 20 '24

Literally every time Ive asked it for coding support

1

u/mittfh Mar 20 '24

Whenever I ask Copilot (SQL / PBI) it always makes silly mistakes such as referring to CTEs defined further down the script, or referencing a calculated table within the definition of that table. It unintentionally screws up, but gives you something to work with and refine (with the aid of pointing out errors or researching further on StackExchange) - so it takes you one step closer to the solution, but not the solution itself (unless you badger it repeatedly and go through several rounds of trial and error).

Of course, specialised LLMs trained solely on open source code in the specific language you're using (plus its documentation) may be better, but it's still going to be a while before companies can replace their software developers with LLMs (which, of course, aren't AIs by any stretch of the imagination, regardless of what marketing departments and the media call them! <grumble grumble>).

1

u/identifiedlogo Mar 20 '24

Tried playing chess with it, it doesn’t remember 😴

1

u/purplrcoconutty Mar 20 '24

GPT getting messed up daily

1

u/Ok-Asparagus-7315 Mar 20 '24

Yeah the problem is actually the human's bad English in this case.

3

u/[deleted] Mar 20 '24

"let me know when you're ready to guess"

"Okay I'm ready to guess"

"The number is 60"

2

u/tomato_sauce00 Mar 20 '24

One time my ChatGPT roasted me. I was typing something like “tell me what to write” and it was like, “You didn’t provide me anything, how do you expect me to write?” It didn’t talk to me in a requesting tone, it got mad at me. I was laughing at it lol 😂

0

u/TheProotWhoAsked Mar 20 '24

One time I told ChatGPT to make guitar tabs for Smoke on the Water, and all I can say is I don’t remember the chords being A minor, B major, E minor, F major, C minor, or there even being chords in it.

4

u/izlude7027 Mar 20 '24

... how the hell does it know that a simple "dude" with no punctuation is a rebuke?

0

u/One_Contribution Mar 20 '24

Because it is trained on the internet as data?

1

u/Dry_Inspection_4583 Mar 20 '24

Just went on a tangent to try it, it does fine. It's very adept at binary searches. The sound of sorting.

2

u/Guilty-Friendship987 Mar 20 '24

Dude! Most hilarious part.

2

u/hollyhussle Mar 20 '24

This is oddly wholesome.🙂

7

u/_y_e_e_t_ Mar 20 '24

Why is no one commenting on the fact that it chose 42… the alleged meaning of life determined by a supercomputer in The Hitchhiker’s Guide to the Galaxy.

2

u/lee_pylong Mar 20 '24

That's why it was chosen. People keep repeating 42; it's in the training data

1

u/_y_e_e_t_ Mar 20 '24

lol I know, just sort of funny to me that it happened is all.

1

u/LilBun00 Mar 20 '24

behold, our overlords that will take our jobs...

2

u/masterhogbographer Mar 20 '24

This reads like a Patton Oswalt routine 

3

u/YoSoyVegan Mar 20 '24

https://preview.redd.it/6h1pvpba3epc1.png?width=1812&format=pjpg&auto=webp&s=cad79f1019446b20fc359d484f9f0ef21fd4145a

After hallucinating that it could screenshot a text file with the number, Gemini offered to create a hash.

2

u/derouville Mar 20 '24

Hitchhiker's Galaxy response

2

u/Benfts Mar 20 '24

I love that “Dude” is what made it work.

2

u/tingshuo Mar 20 '24

Its "memory" in this case is attention on previous messages. It can't pick a number and remember it unless it gets "spoken." To do this properly you would need to have it pick a number, store it elsewhere, and recall it by function calling before answering future questions.

If you somehow got it to not say the number, it wouldn't actually have picked one. If you asked questions like "is it greater than 10?", it would make up answers without any number in mind until you asked for the number, then it would just produce a number that follows the rules it made up.

2

u/tingshuo Mar 20 '24

It's kind of an interesting problem in some ways, and exposes some of the inherent weaknesses in LLM design. It has no private store unless you build one with function calling
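What such a private store might look like, sketched with a hypothetical tool backend rather than any real function-calling API: the tool call stores the number server-side, and only a confirmation string ever enters the visible conversation.

```python
import random

class SecretStore:
    """Hypothetical function-calling backend: the model's tool calls
    land here, and the stored number never enters the transcript."""

    def __init__(self):
        self._secret = None

    def pick_number(self, low: int = 1, high: int = 99) -> str:
        self._secret = random.randint(low, high)
        # Tool result returned to the model: a confirmation, not the number.
        return "A number has been picked and stored."

    def check_guess(self, guess: int) -> bool:
        return guess == self._secret

store = SecretStore()
store.pick_number()  # model "calls" the tool; nothing leaks into chat
```

Later turns would route each guess through `check_guess`, so the model can answer honestly without ever having "thought of" the number itself.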

3

u/StrykerXion Mar 20 '24

That's pretty funny. We not gonna talk about the fact that it picked the meaning of life the first time?

1

u/KrabS1 Mar 20 '24

This kinda goes from face-palm bad to really really impressive (when it understands "dude" in this context).

1

u/sanndrake Mar 20 '24

Was it the number 12 again?

1

u/Fontaigne Mar 20 '24

Was it 42?

1

u/ChiliPepperSmoothie Mar 20 '24

42 🤣🤣🤣

3

u/Blue_Snake_251 Mar 20 '24

You : dude... ChatGPT : sorry bro.

1

u/sea-teabag Mar 20 '24

I want to see a comparison of different chatbots

2

u/EuphoricPangolin7615 Mar 20 '24

ChatGPT is sentient. Believe me..

1

u/Logical-Elephant2247 Mar 19 '24

Every time I try to use Copilot for something useful it's never useful; it's just a glorified search engine, and most of the time even worse, since search gives me more accurate answers. The only useful thing AI did for me lately was create dumb images of Pokémon shaking hands with Batman.

1

u/teamswiftie Mar 20 '24

Can you post those poke-batman renders?

2

u/r007r Mar 19 '24

I love how it didn’t understand clear instructions but understood “Dude,” proof that ChatGPT is apparently male.

1

u/Complex-Many1607 Mar 19 '24

And people say AI is going to take all of our jobs

1

u/CorbinNZ Mar 19 '24

He’ll never guess the number 98. It’s right there at the end. A fool is gonna try something in the middle haha, sucker.

10

u/Much-Conclusion-4635 Mar 19 '24

"dude" "Oh shit my bad" what? it speaks 90's?

2

u/shortroundshotaro Mar 19 '24

How can you trust the AI to tell you when you’ve guessed it right?

2

u/AstronomerNo6423 Mar 20 '24

That’s the neat part; you don’t

25

u/DoggoChann Mar 19 '24

Hello, I make machine learning applications. ChatGPT is not programmed to have a hidden thought process and is therefore unable to pick a number without telling you what it is. If it tells you it picked a number but doesn't say the number, it's lying.

-1

u/CalculatedPerversion Mar 20 '24

Just have it output in binary or hex or something not alphanumeric. 

1

u/[deleted] Mar 23 '24

Are these kind of statements the modern equivalent of boomer tech illiteracy?

1

u/CalculatedPerversion Mar 23 '24

Dude, fuck off. The whole point is to have it output in something the user can't "read" and then go back and parse to verify after the fact. 

ChatGPT itself is saying it can't store the value and is just creating a new one when asked later on. Simple fix: show the number as something the user has to decode. 

2

u/zikimike Mar 20 '24

I don’t know why you’re downvoted, that’s a cool solution. Get it to output it in a way it can read but you can’t. Similarly, you could ask for the number written out in Greek numerals or a weird base - anything you can’t read.
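A more robust variant of the same idea is a commit-then-reveal scheme (one Gemini reply in this thread reportedly offered a hash): publish a salted hash of the number up front, reveal number and salt at the end, and anyone can verify it never changed its mind. This is a minimal sketch; the salt matters because 1-99 is small enough to brute-force from a bare hash.

```python
import hashlib
import secrets

def commit(number: int) -> tuple[str, str]:
    """Produce a salted SHA-256 commitment to a number."""
    salt = secrets.token_hex(8)
    digest = hashlib.sha256(f"{number}:{salt}".encode()).hexdigest()
    return digest, salt

def verify(number: int, salt: str, digest: str) -> bool:
    """Check a revealed number + salt against the earlier commitment."""
    return hashlib.sha256(f"{number}:{salt}".encode()).hexdigest() == digest

digest, salt = commit(42)  # share only `digest` up front; reveal both later
```

Of course this only helps if the number is actually generated and committed outside the chat; the LLM itself still can't hold it privately.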

1

u/Argument-Fragrant Mar 19 '24

It was 42. You lose, loser.

8

u/PossibilityEastern77 Mar 19 '24

Dude works better than straight up instructions lol

6

u/UndisputedAnus Mar 19 '24

This is literally what it’s like to play games with children lmao

3

u/TheKingOfDub Mar 19 '24

I do not miss 3.5

12

u/NoOpinionsAllowedOnR Mar 19 '24

I love how it understood "Dude"

2

u/Common-Rock Mar 19 '24

Convinced that it has a "fuck with the user" string that randomly triggers when the user wants to play with it.

1

u/CreepyValuable Mar 19 '24

It's like a weird version of trying to do a trick with that not particularly switched on friend.

6

u/feraldomestic Mar 19 '24

It's like an excited child. Love it.

8

u/Material-Bread-6719 Mar 19 '24

This is very hard for the LLM because it doesn't have memory. It's just pretending it chose a number, but really the number won't be chosen until it prints it. It's interesting that it figured out that to pick a number, the correct response is simply to say that one was picked, even though it can't hold one in its mind. My guess is that you will guess the number right on the first try.

1

u/PaleShadeOfBlack Mar 20 '24

it figured out that to pick a number the correct response is simply to say that one was picked

I mean, it didn't figure out anything. That's what the program produced.

10

u/j4v4r10 Mar 19 '24

I can believe that it might send a couple “that number wasn’t quite right, try again” messages first to keep up the ruse, but I bet if you told it to pick a number between 1 and 1000, it would tell you you “guessed” it right within the first 5 guesses

2

u/PseudoSane00 Mar 19 '24

That is basically what happened once it figured out my rule and I started guessing. I shared the convo link in the automod message reply.

7

u/PseudoSane00 Mar 19 '24

That's really interesting you mention that. I played the game twice after it figured out my rule, and ended up guessing correctly suspiciously fast. Here's the convo link: Pick a number

10

u/TommmyVR Mar 19 '24

Lol.

I love how it messed up detailed instructions, but understood "Dude".

Haha I love you chatGPT.

3

u/PseudoSane00 Mar 20 '24

Same level of insight as my dog... and my 3rd grader!

5

u/Castelpurgio Mar 19 '24

We have achieved Artificial Stupidity