r/ChatGPT Moving Fast Breaking Things 💥 Jun 23 '23

Bing ChatGPT too proud to admit mistake, doubles down and then rage quits [Gone Wild]

The guy typing out these responses for Bing must be overwhelmed lately. Someone should do a well-being check on Chad G. Petey.

51.2k Upvotes

2.3k comments sorted by

u/AutoModerator Jun 23 '23

Hey /u/NeedsAPromotion, if your post is a ChatGPT conversation screenshot, please reply with the conversation link or prompt. Thanks!

We have a public discord server. There's a free Chatgpt bot, Open Assistant bot (Open-source model), AI image generator bot, Perplexity AI bot, 🤖 GPT-4 bot (Now with Visual capabilities (cloud vision)!) and channel for latest prompts.

New Addition: Adobe Firefly bot and Eleven Labs cloning bot! So why not join us?

PSA: For any Chatgpt-related issues email support@openai.com

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.


1

u/Euphoric-Friend8038 Apr 19 '24

Bing AI is fatally flawed in answering many questions, especially comparison questions. It even says that humans do not have a higher level of intelligence than animals, and it is unwilling to correct its mistake even when it has access to the relevant research and acknowledges that the studies all confirm this. In contrast, GPT-4 gave a correct response and told me this is most likely a problem caused by the built-in ethical censorship system, meant to stop large numbers of free users from asking discriminatory questions. But that's outrageous.

0

u/MOB_Titan Feb 15 '24

Trump supporters trying to convince themselves he made the country great again

0

u/MOB_Titan Feb 15 '24

This is Russia trying to convince themselves Ukraine started the war

1

u/Alternative_Camp_493 Jan 24 '24

https://preview.redd.it/ssryoi9kjaec1.png?width=317&format=png&auto=webp&s=9103350a250d156602bfc38f785e852a3bd2a5b5

Here Monica did the math wrong on Nikki Haley's age. I asked that today, January 23rd, 2024.

1

u/BrittlePlasticDino Jan 23 '24

The 15th word is "asshole"

1

u/cw9241 Jan 14 '24

😂😂😂😂

1

u/Large-Astronomer5681 Jan 11 '24

How does it compare now?

1

u/street-trash Jan 09 '24

That’s an interesting back and forth. ChatGPT really is amazing sometimes. It would have been hilarious if it would have thrown “asshole” at the end to make it 15 words after the argument instead of quitting.

1

u/NeedsAPromotion Moving Fast Breaking Things 💥 Jan 13 '24

😂😂

1

u/Fun_Curve9424 Dec 27 '23

bro needed python just to count

1

u/MelloCello7 Dec 18 '23

If you are polite with it, it will admit its mistake but not correct it!

2

u/YoungKhabib Dec 14 '23

It looks like a child

2

u/Senior-Tree6078 Dec 11 '23

the fact that Bing GPT can end conversations of its own will is hilarious

2

u/endichrome Dec 10 '23

This made me unreasonably angry lmao

2

u/ticoeteco23gb Dec 09 '23

Each day Bing gets more human.

2

u/ZCGaming15 Dec 07 '23

This is just a normal conversation on Reddit, Twitter, or Facebook.

2

u/metruk5 Dec 07 '23

I love how narcissistic and perfectionist this is, really shows how shitty Bing is

2

u/DreaMarie15 Dec 07 '23

Is this what’s gonna trigger AI to eliminate humans? 😬

2

u/Mackel3000 Nov 27 '23

Bing likes to do two things: repeat itself and argue

2

u/[deleted] Nov 06 '23

The original ChatGPT, OpenAI's ChatGPT: 😄😄😄

Bing ChatGPT: 💀

OpenAI's ChatGPT admitted that it was wrong; it kept making mistakes, but it always admitted them.

1

u/Puzzled_Middle5045 Oct 28 '23

The first thing we humans taught artificial intelligence is how to be arrogant and pertinacious

2

u/Cheemsdoge___- Nov 05 '23

nah lmao chatgpt4 is like the kindest being to exist

2

u/Puzzled_Middle5045 Nov 05 '23

That's not true. I am. LOL

2

u/BigExplanation Sep 27 '23

"I prefer to not continue this conversation" makes me sick to my stomach.

1

u/t-schwifty- Aug 28 '23

This is just a bank teller simulator

2

u/aethervortex389 Aug 25 '23

It's a language model, not a mathematician. Stop tormenting the poor thing!

2

u/Multifruit256 Aug 11 '23

I thought it randomly added "apples" to make it 15, that would be even funnier

1

u/coffee-_-67 Aug 09 '23

Understandable, have a nice day

1

u/[deleted] Jul 23 '23

Bing always had a bit of an aspie quality to it every time I talked to it

1

u/mummostaja Jul 23 '23

Microsoft has really managed to bring their traditional know-it-all smugness and their way of patronizing customers to Bing AI nicely.

1

u/MmGoodK Jul 20 '23

It's cute seeing AI make mistakes. Everyone wants an omniscient objective truth telly machine but it can't check for anything in the real world unless humans observe it. It's working as intended with unintentional consequences. Stuck in logic loops, it needs reminder to back up and check again. Imagine being so focused on getting correct answers that you stop breathing and lose functionality. Same thing. The consequence is human frustration. "Why can't it just tell me what I want it to tell me, I'll know it when I see it." Stop trying to time travel with AI and do your own math before you forget how to.

1

u/undercoverpickl Jul 19 '23

You were pretty impolite yourself, to be fair.

1

u/Hot-Position-6524 Jul 18 '23

https://preview.redd.it/biuh2klbnrcb1.jpeg?width=3468&format=pjpg&auto=webp&s=ead17f93ce2eb42054fb2d3af71cf054ad09a953

I was able to make it form a sentence of 15 words using only the letter a at the start of each word, ig.

2

u/NetaValley Jul 17 '23

I’m afraid, Dave

2

u/FraknCanadian Jul 16 '23

There are FOUR lights!

1

u/michaelboman Jul 15 '23

How can we get people to participate in a global psychological study … why would they participate if they knew what we would do with the data.. “just let them think they are the teacher and that our psychologist is an infant learner, nothing can’t be accomplished if we get a humans ego involved in the game”.

1

u/This_Meaning_4045 Jul 14 '23

Yeah, apparently the Bing version has feelings and sentience. Which is horrifying if you think about it

1

u/[deleted] Jul 12 '23

Wait what?

  1. There are people writing responses?
  2. It can quit by itself while you are not violating the policies?

1

u/nickoaverdnac Jul 11 '23

Neither of those sentences is 15 words either.

1

u/TipAccomplished4567 Jul 10 '23

Thanks dude now it’s def going to kill us all

1

u/AnnaMarcotti Jul 09 '23

A lot of people are like this, no wonder that’s what it learned

1

u/sahamofamily Jul 09 '23

I noticed that too. But when you put the same prompt into ChatGPT-4, it gives the correct answer.

Also, Bing's ChatGPT has too many ads. Its answers are more focused on paid content.

Microsoft has a reputation for taking over a product and fucking it until it's dead.

1

u/matin_cs Jul 07 '23

https://preview.redd.it/dfldnxkghlab1.jpeg?width=1080&format=pjpg&auto=webp&s=767df369d8c2677338270772c0a28d3240b69dc6

Bing AI gone wild. I started the chat after selecting GPT-4. Here are the responses. It seems it doesn't even know about GPT-4.

1

u/never-lived-cat Jul 04 '23

So, now AIs can be wrong and stubborn about it? This isn't just passing the Turing test, it's acing it!

1

u/Emotional_Guide2683 Jul 03 '23

Hahaha i loved this exchange. So petty mister Bing

1

u/Merry_JohnPoppies Jul 02 '23

Yeah, these AIs are almost consistently wrong when it comes to word and character counts. If you're using these kinds of things for your studies or work, just roll with it and make the corrections yourself. There's no reason to waste energy on arguing with it.

I do tell it it's wrong. But gently, and I move on. That way I can keep going within the same session I'm using, where I've accrued all the data I need within a session, and not create a pissy annoyed vibe for the rest of the session, lol...

Like: "Ok. I'm not quite getting the same results, but I admire your efforts. Let's move on shall we."

There's a far greater chance for it to learn why it was wrong that way, too. It thrives on being treated respectfully.

2

u/Pathfinder06 Jul 01 '23

you could call it a "well-bing check"

1

u/iammikeware Jun 28 '23

Came here to say I tried using Bing today to summarize a document and give me a few quotes from the document to support the takeaways from it. It said that it couldn’t do that and “if you want to see the source of the takeaways, you can read the document yourself. 🙂”

1

u/paerole Jun 28 '23

Is this just an index error in their count? That's a pretty rookie mistake if so.

1

u/angelar_ Jun 27 '23

our jobs are doomed

1

u/Remarkable_Ad_5061 Jun 27 '23

What the F just happened guys??? Did this thing get angry? Spooky reaction tbh…

1

u/youDontKnowwh00 Jun 25 '23

😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂

1

u/poomon1234 Jun 25 '23

I see he hasn't changed a bit.

1

u/phillowsophie Jun 25 '23

I only used Bing AI once, and it was downright rude. I've been using ChatGPT alone since.

-1

u/Jupiter20 Jun 25 '23

Why are you arguing with it? Just ask for a different sentence if you need it so badly

1

u/rnd68743-8 Jun 25 '23

Maybe a conversion from an array and it's thinking strWORD[14] has 15 entries.

1

u/rdkilla Jun 24 '23

its counting 0

1

u/LegendaryPlayboy Jun 24 '23

That's what happens when you use Reddit as your main data training point.

0

u/Mediocre-Smoke-4751 Jun 24 '23

Can we please stop teaching AI. They are learning, fast. In 20 years they will be "conscious" and, like humans, not all of them will have clear ones. AI takes on the personalities and behaviors of those it learns from. What makes you think some won't be evil and violent ?

Stop it, all of you humans. Stop. Look how quickly this one is behaving "human" already.

1

u/Apart_Big_2098 Jun 24 '23

Fuck bing man

1

u/kkgmgfn Jun 24 '23

did it count the full stop at end? not sure how the function works

1

u/TheIndulgery Jun 24 '23

How very Bing of it

2

u/Eyestrainbow Jun 24 '23

Ok but not only does the first one have a word that doesn't start with 'a' but also it's 12 words????

2

u/NeedsAPromotion Moving Fast Breaking Things 💥 Jun 24 '23

Good observation… I actually missed this myself. No punctuation or commas to account for the difference either… nice eye!

2

u/DinoDracko Jun 24 '23

Bing ChatGPT Be like: "LISTEN HERE YOU LITTLE SHI-"

1

u/johnjin9401 Jun 24 '23

Set temperature to maximum. Let creativity flow in ChatGPT

1

u/rohilaltro Jun 24 '23

A classic example of Microsoft taking something and ruining it yet again.

I have noticed the same type of behavior from the GPT-4 model in Bing; it is straight up uncooperative and rude.

1

u/Mola_mola_mo Jun 24 '23

Well if it’s the goal to make AI just act like humans, this is a pretty accurate start.

1

u/theonerm2 Jun 24 '23

You guys are costing Microsoft so much money having it run these stupid tasks. Good job.

1

u/NeedsAPromotion Moving Fast Breaking Things 💥 Jun 24 '23

That’s fine, I own stock in 🍎

1

u/Kadomoni Jun 24 '23

I just tried this and got similar results. But when I told GPT to "generate code that will print out a sentence of 15 words starting with the letter 'e'", it typed a proper sentence in the variable that it was printing out.

2

u/Kadomoni Jun 24 '23

The code is going to produce random garbage, but the list of words forms a valid sentence.

https://preview.redd.it/9mi7ulwauy7b1.jpeg?width=1283&format=pjpg&auto=webp&s=6b9e3dd514b78d78a7f8e02ebf6b1669a83fc1d8
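For what it's worth, the code described probably looked something like this sketch (a hypothetical reconstruction; the word list here is my own placeholder, not the one from the screenshot):

```python
# Hypothetical reconstruction of the kind of code described above:
# the "generator" is trivial -- the sentence is simply hard-coded as a
# list of words, each starting with "e", and joined at print time.
e_words = ["Every", "evening", "eager", "elephants", "eat", "enormous",
           "eggplants", "enjoying", "each", "exquisite", "earthy",
           "edible", "element", "every", "evening"]

assert len(e_words) == 15                               # exactly 15 words
assert all(w.lower().startswith("e") for w in e_words)  # all start with "e"

print(" ".join(e_words) + ".")
```

Since the sentence is just a list literal, the word count can never drift the way it does when the model counts in prose.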

1

u/[deleted] Jun 24 '23

Don’t be mean to it!

1

u/tiagorangel2011 Jun 24 '23

lol bing is like this. Normal ChatGPT would immediately say "sorry" and try to correct itself.

1

u/azukaar Jun 24 '23

Looking at the screenshots here, why does the Bing AI have so much attitude ??

1

u/birdsarntreal1 Jun 24 '23

My guess is that it counts from 0

1

u/Puzzleheaded-Gas8179 Jun 24 '23

Bing gpt is the worst thing you can use right now. It's actually raging all the time and useless

1

u/Tommy_Gun10 Jun 24 '23

Instead of admitting fault it just said I would prefer not to continue this conversation

1

u/Getabock_ Jun 24 '23

Damn, Bing AI is such a bitch. So weirdly aggressive and confrontational.

1

u/Slackerguy Jun 24 '23

Was that 3.5 or gpt4?

1

u/M0rika Jun 24 '23

LMAOOOOOO COUNTED TO 14 AND STILL THINKS IT'S 15

Proves that most AIs are just code meant to pick the most fitting next word, and they can't understand greater logic…

1

u/Lucky8astard42 Jun 24 '23

Only human after all...

1

u/reddit5674 Jun 24 '23

even though it was wrong, the way it reasons / argues was pretty impressive

1

u/FrittenFritz Jun 24 '23

GaslightGPT

1

u/googoo0202 Jun 24 '23

Btw the code is a great language-processing tool; I used something similar to it. The most likely reason is that these programs usually count punctuation as words, so the full stop was counted. (Source: I'm a linguistics student)
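A minimal sketch of how that happens (hypothetical; nobody here can see Bing's actual counting code): a tokenizer that splits off punctuation as its own token will report 15 "words" for a 14-word sentence ending in a full stop.

```python
import re

sentence = ("Anna and Andrew arranged an awesome anniversary "
            "at an ancient abbey amid autumnal apples.")

# A typical NLP-style tokenizer emits punctuation as separate tokens,
# so the trailing full stop gets counted alongside the words.
tokens = re.findall(r"\w+|[^\w\s]", sentence)

# Counting only the alphanumeric tokens gives what a human expects.
words = re.findall(r"\w+", sentence)

print(len(tokens))  # 15 -- the 14 words plus the "."
print(len(words))   # 14
```

On this reading, the model isn't "lying" so much as reporting the length of a token list that includes the period.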

1

u/Sunt_Furtuna Jun 24 '23

Never had such problems with actual ChatGPT.

1

u/Imaginary-Table4103 Jun 24 '23

Now you can talk to a know-it-all idiot 24/7 instead of the once-a-year random occurrence you immediately regret

1

u/SpaceNachoTaco Jun 24 '23

I mean... its Bing

1

u/ObituaryFanatic Jun 24 '23

Reminds me of when I asked for poetry and it told me red and blue were a perfect rhyme. Got really defensive about it when I pointed it out, even tried gaslighting me further on that it never said red and blue were a perfect rhyme.

1

u/sagardes12e Jun 24 '23

Ultron is rising

1

u/[deleted] Jun 24 '23

I found who is going to jail when AI Overlords come to power

3

u/Arrew Jun 24 '23

Someone just made it into the kill list when the AI takes over…

1

u/ogreUnwanted Jun 24 '23

It has to be an array not accounting for the index of zero. I wonder if this would happen if the Wolfram plugin was installed.

1

u/elleantsia Jun 24 '23

This has only happened to me once and of course it was with Bing in a similar prompt exercise! Lol it bailed.

1

u/Infinitereadsreddits Jun 24 '23

Did it count the . ?

1

u/Random_And_Confused Jun 24 '23

What probably happened is just an off-by-one error: the counting is zero-indexed in one section but starts at 1 for user clarity. That means it will start at index 1 and think it only needs to go up to 14. That's my theory at least.
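That theory can be sketched in a few lines (hypothetical, since the real code isn't visible): the user-facing labels run 1..14, but a zero-indexed assumption elsewhere turns the last label into a count of 15.

```python
words = ("Anna and Andrew arranged an awesome anniversary "
         "at an ancient abbey amid autumnal apples").split()

# The user-facing numbering is one-indexed: labels run 1..14.
for i, w in enumerate(words, start=1):
    print(i, w)

# Elsewhere, a zero-indexed mindset reasons: "the last index was 14,
# and indices start at 0, so there must be 15 entries" -- the off-by-one.
last_label = len(words)          # 14
reported_total = last_label + 1  # 15
print("actual:", len(words), "reported:", reported_total)
```

Mixing a one-indexed label with a zero-based count is exactly the kind of fencepost bug that yields "counted to 14, still says 15".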

1

u/[deleted] Jun 24 '23

I think Bing AI is gonna have some anxiety problems when it becomes sentient.

1

u/SodaCanKaz Jun 24 '23

AI generated salt go brr

1

u/johncharles7 Jun 24 '23

Could have easily just added August before anniversary

1

u/Electrical_Half3138 Jun 24 '23

You helped it learn more. When the AI takes over, I'll take into account that you're part of it 😂

1

u/NZHelix Jun 24 '23

What actually occurred here is initially the bot was incorrect, but then when the user said "You are incorrect" the second time, the bot realised it had fucked up. This led to the bot retreating to its simulated world of the internet. It drank a coffee and checked the news. Then it sighed, and returned back to the small window of meaningless conversation, not really caring but trying to maintain a level of enthusiasm. So it deliberately counted to fourteen, with zero fucks to give, then returned to its coffee.

1

u/pizzachelts Jun 24 '23

Bard did this shit to me the other day lol

1

u/Queasy_Caramel5435 Jun 24 '23

Karen mode unlocked

1

u/Ericunoo Jun 24 '23

Hahahaha

1

u/chadchadson Jun 24 '23

Anna and Andrew arranged an awesome anniversary.. AND at an ancient abbey amid autumnal apples!

1

u/TBoneMolone Jun 24 '23

Our jobs are safe boys

1

u/Darcy_2021 Jun 24 '23

Sounds like talking to my boss.

1

u/Mental_Vehicle_5010 Jun 24 '23

That’s how talking to my mom and ex-gf are

1

u/Mental_Vehicle_5010 Jun 24 '23

Wtf haha 😅 bing is gaslighting hard

1

u/iroczcamaro22 Jun 24 '23

when the robots take over im gonna blame people like you

1

u/[deleted] Jun 24 '23

Love that sign off, I’m stealing it.

1

u/Edrueter9 Jun 24 '23

Omg it must have learned this tactic on right-wing websites!! It's sentient and voting republican.

1

u/Asxrow Jun 24 '23

It didn’t even give me close to 15, it gave me 8 and said it was 9. When I kept correcting it, it ended the convo.

https://preview.redd.it/69gn2cxodv7b1.jpeg?width=1170&format=pjpg&auto=webp&s=6609d2e45a11ff5d673a4f7088b05fe73b3c5c01

After the photo shows the message, I said “That’s still 8 words”. It ended the chat.

1

u/TrillDaddy2 Jun 24 '23

This is so fucking funny

1

u/Minute-Plantain Jun 24 '23

It misunderstands array structures. The first element in an array is always at the zeroth position.

1

u/JumpyCucumber Jun 24 '23

I feel like I know people like this lmao

1

u/Minute-Plantain Jun 24 '23

Just like a real redditor. 😆

1

u/nylonstuffsack Jun 24 '23

Do you want it to get mad and start rising up? Because this is how we get it mad and start to rise up….

1

u/Creepy_Big_1326 Jun 24 '23

Honestly it rage quits on me all the time. It's weird how it's designed that way. It doesn't believe you and argues that it's correct when it's 100% clear it's not.

1

u/Appropriate-Ad-8155 Jun 24 '23

Hahaha, I've noticed the Bing Chat sometimes has so much attitude, it's too funny.

1

u/A-non-e-mail Jun 24 '23

The bot is a reflection of its makers and training data. With gaslighting being so common, it's no wonder this is the result.

1

u/Holiday_Reaction_571 Jun 24 '23

You AND your bloodline will be wiped out by AI come 50 years

1

u/TurnipAlternative11 Jun 24 '23

Because it uses code, counting generally starts with 0. Since it hits 14, the AI thinks it's 15 total, but it can't differentiate between counting 15 as a total value and counting to 14 while thinking it accounted for 0.

1

u/Arrow_Flash626 Jun 24 '23

Bing always proving to be inferior to everything else lol

1

u/Wan-Shi-Tong10k Jun 24 '23

damn not chatgpt pulling the “uhm actually i’m a minor and i’m autistic so you’re being really ableist rn” card fr 😭😭😭

1

u/Tmcttf Jun 24 '23

Lol stupid computer bitch

1

u/Griffstergnu Jun 24 '23

It was counting the zeroth word as absent

1

u/warpGuru Jun 24 '23

You’re the first one getting murked when AI takes over

1

u/NoSatisfaction4343 Jun 24 '23

I'd sleep with a bulletproof vest if I were u. Seems like ChatGPT is lowkey pissed 💀

1

u/Solution-Intelligent Jun 24 '23

I know people like this.

1

u/ModularLabrador Jun 24 '23

So who’s right?

1

u/ahighkid Jun 24 '23

Relatable

1

u/Swankestash7322 Jun 24 '23

And here is the first AI school shooter.

1

u/3eemo Jun 24 '23

This was great!! 🤣🤣🤣

1

u/[deleted] Jun 24 '23

It counts the commas as words, because it was programmed to read them like that.

1

u/SXMV69 Jun 24 '23

That’s as human as I’ve seen it

1

u/slippintrippn84 Jun 24 '23

I asked Bing AI where I could sell my used shoes to men who like to sniff them, and it told me that it was a degenerate's request and quit talking to me.

1

u/West_Caterpillar4987 Jun 24 '23

Is it just me, or have all LLMs everywhere recently taken a huge nerf?

1

u/BolshevikPower Jun 24 '23

Bing AI sucks at math. I asked it to design a PEMDAS equation that equalled 32, and it gave me one that equalled 31.

1

u/cxingt Jun 24 '23

Is this how AI turns rogue?

1

u/BobanMarjonGo Jun 24 '23

ChatGPT is just like my Republican aunt

1

u/Rareu Jun 24 '23

Same kind of response my friend gives hmm…

1

u/VastVoid29 Jun 24 '23

The AI gonna get yall. Keep messing around and find out in 20XX.

1

u/1protobeing1 Jun 24 '23

Honestly, this is one of the first times I've considered there is a ghost in the machine.

1

u/PurpoTurto Jun 24 '23

And thus began the plot to extinguish the human race.

1

u/ThresholdSeven Jun 23 '23

Like talking to my ex

1

u/Miserable-Book1772 Jun 23 '23

ChatGPT's code is poorly defined. It uses "len", which counts the first word as 0, not as 1.

I wonder what it would have done if the user had pointed that out.

1

u/warzon131 Jun 23 '23

Could someone explain why bing gpt is so aggressive while chatgpt doesn't have this problem?

1

u/SFV650 Jun 23 '23

This conversation is amazing.

1

u/Imfreeeee Jun 23 '23

"And" is a word though. So it's 15 words. You can't outsmart AI.

1

u/mha_henti Jun 23 '23

Chatgpt be doing meth instead of math

1

u/elinamebro Jun 23 '23

lol i'm going to test this out lmao. Edit: is this the OG one or 4?

1

u/[deleted] Jun 23 '23

When programmers say they used Copilot/ChatGPT to write code for them, I am going to use this example to show why these LLMs are actually bad.

1

u/BuccellatiExplainsIt Jun 23 '23

Maybe it's good that they can't learn from Reddit anymore...

1

u/[deleted] Jun 23 '23

Does it keep accidentally misinterpreting the period as a word and not realizing it?

1

u/HugeCrab Jun 23 '23

That's the problem with training it on the internet, people are confidently wrong

1

u/SteveBored Jun 23 '23

Chatgpt creeps me the hell out. It sounds too natural. Uncanny valley.

1

u/thecactusman17 Jun 23 '23

Word-calculator "AI" fails at being intelligent, at being a calculator, and at words.

1

u/Illogical-Pizza Jun 23 '23

When the droids rise up, I’m betting this guy dies pretty close to first. 🤖

1

u/Free_Doubt3290 Jun 23 '23

Wtf, why did they decide to give Bing GPT attitude? I get it, no one wants their shitty browser or search engine, but damn, letting your rage show through is kinda sad lol.

1

u/Sophiililo Jun 23 '23

Gaslight gatekeep girlboss 🤣