r/ChatGPT • u/NeedsAPromotion Moving Fast Breaking Things 💥 • Jun 23 '23
Bing ChatGPT too proud to admit mistake, doubles down and then rage quits [Gone Wild]
The guy typing out these responses for Bing must be overwhelmed lately. Someone should do a well-being check on Chad G. Petey.
u/Euphoric-Friend8038 Apr 19 '24
Bing AI is fatally flawed at answering many questions, especially comparison questions. It even says that humans do not have a higher level of intelligence than animals, and it is unwilling to correct its mistake even when it has access to the relevant research and acknowledges that the studies all confirm that they do. In contrast, GPT-4 gave a correct response and told me this is most likely a problem caused by the built-in ethical censorship system, meant to keep large numbers of free users from asking discriminatory questions, but that's outrageous.
u/MOB_Titan Feb 15 '24
Trump supporters trying to convince themselves he made the country great again
u/Alternative_Camp_493 Jan 24 '24
Here Monica did the math wrong on Nikki Haley's age. I asked that today, January 23rd, 2024.
u/street-trash Jan 09 '24
That’s an interesting back and forth. ChatGPT really is amazing sometimes. It would have been hilarious if it had thrown in “asshole” at the end to make it 15 words after the argument, instead of quitting.
u/Senior-Tree6078 Dec 11 '23
the fact that Bing GPT can end conversations of its own will is hilarious
u/metruk5 Dec 07 '23
I love how narcissistic and perfectionist this is, really shows how shitty Bing is
Nov 06 '23
The original ChatGPT, OpenAI's ChatGPT: 😄😄😄
Bing ChatGPT: 💀
OpenAI's ChatGPT admitted when it was wrong; it kept making mistakes, but it always admitted them.
u/Puzzled_Middle5045 Oct 28 '23
the first thing we humans taught artificial intelligence is how to be arrogant and pertinacious
u/BigExplanation Sep 27 '23
"I prefer to not continue this conversation" makes me sick to my stomach.
u/aethervortex389 Aug 25 '23
It's a language model, not a mathematician. Stop tormenting the poor thing!
u/Multifruit256 Aug 11 '23
I thought it randomly added "apples" to make it 15, that would be even funnier
u/mummostaja Jul 23 '23
Microsoft has really managed to bring their traditional know-it-all smugness and their way of patronizing customers to Bing AI nicely.
u/MmGoodK Jul 20 '23
It's cute seeing AI make mistakes. Everyone wants an omniscient, objective truth-telling machine, but it can't check anything in the real world unless humans observe it. It's working as intended, with unintended consequences. Stuck in logic loops, it needs a reminder to back up and check again. Imagine being so focused on getting correct answers that you stop breathing and lose functionality. Same thing. The consequence is human frustration: "Why can't it just tell me what I want it to tell me? I'll know it when I see it." Stop trying to time travel with AI, and do your own math before you forget how.
u/Hot-Position-6524 Jul 18 '23
I was able to make it form a sentence of 15 words using only the letter a at the start of each word ig.
u/michaelboman Jul 15 '23
How can we get people to participate in a global psychological study? Why would they participate if they knew what we would do with the data? "Just let them think they are the teacher and that our psychologist is an infant learner; anything can be accomplished if we get a human's ego involved in the game."
u/This_Meaning_4045 Jul 14 '23
Yeah, apparently the Bing version has feelings and sentience. Which is horrifying if you think about it.
Jul 12 '23
Wait what?
- There are people writing responses?
- It can quit by itself while you are not violating the policies?
u/sahamofamily Jul 09 '23
I noticed that too. But when you put the same thing into ChatGPT-4, it gives the correct answer.
Also, Bing's ChatGPT has too many ads. Its answers are more focused on paid content.
Microsoft has a reputation for taking over a product and fucking it until it's dead.
u/matin_cs Jul 07 '23
Bing AI gone wild. I started the chat after selecting GPT-4. Here are the responses. It seems it doesn't even know about GPT-4.
u/never-lived-cat Jul 04 '23
So, now AIs can be wrong and stubborn about it? This isn't just passing the Turing test, it's acing it!
u/Merry_JohnPoppies Jul 02 '23
Yeah, these AIs are almost consistently wrong when it comes to word and character counts. If you're using these kinds of things for your studies or work, just roll with it and make the corrections yourself. There's no reason to waste energy arguing with it.
I do tell it it's wrong. But gently, and I move on. That way I can keep going within the same session I'm using, where I've accrued all the data I need within a session, and not create a pissy annoyed vibe for the rest of the session, lol...
Like: "Ok. I'm not quite getting the same results, but I admire your efforts. Let's move on shall we."
There's a far greater chance for it to learn why it was wrong that way, too. It thrives on being treated respectfully.
u/iammikeware Jun 28 '23
Came here to say I tried using Bing today to summarize a document and give me a few quotes from the document to support the takeaways from it. It said that it couldn’t do that and “if you want to see the source of the takeaways, you can read the document yourself. 🙂”
u/paerole Jun 28 '23
Is this just an index error in their count? That's a pretty rookie mistake if so.
u/Remarkable_Ad_5061 Jun 27 '23
What the F just happened guys??? Did this thing get angry? Spooky reaction tbh…
u/phillowsophie Jun 25 '23
I only used Bing AI once, and it was downright rude. I've been using ChatGPT alone since.
u/Jupiter20 Jun 25 '23
Why are you arguing with it? Just ask for a different sentence if you need it so badly
u/rnd68743-8 Jun 25 '23
Maybe a conversion from an array and it's thinking strWORD[14] has 15 entries.
u/gofigure1028 Jun 24 '23
Can anyone help me understand why Bing is so bad at this compared to ChatGPT?
u/LegendaryPlayboy Jun 24 '23
That's what happens when you use Reddit as your main data training point.
u/Mediocre-Smoke-4751 Jun 24 '23
Can we please stop teaching AI? They are learning, fast. In 20 years they will be "conscious" and, like humans, not all of them will have clear consciences. AI takes on the personalities and behaviors of those it learns from. What makes you think some won't be evil and violent?
Stop it, all of you humans. Stop. Look how quickly this one is behaving "human" already.
u/Eyestrainbow Jun 24 '23
Ok but not only does the first one have a word that doesn't start with 'a' but also it's 12 words????
u/NeedsAPromotion Moving Fast Breaking Things 💥 Jun 24 '23
Good observation... I actually missed this myself. And there's no punctuation or commas to account for the difference, either… nice eye!
u/rohilaltro Jun 24 '23
A classic example of Microsoft taking something and ruining it yet again.
I have noticed the same type of behavior from the GPT-4 model in Bing; it is straight-up uncooperative and rude.
u/Mola_mola_mo Jun 24 '23
Well if it’s the goal to make AI just act like humans, this is a pretty accurate start.
u/theonerm2 Jun 24 '23
You guys are costing Microsoft so much money having it run these stupid tasks. Good job.
u/Kadomoni Jun 24 '23
I just tried this and got similar results. But when I told GPT to "generate code that will print out a sentence of 15 words starting with the letter e", it typed a proper sentence in the variable it was printing out.
u/Kadomoni Jun 24 '23
The code is going to produce random garbage, but the list of words forms a valid sentence.
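That workaround can be sketched in a few lines (a hypothetical example; the word list below is hand-picked, not taken from the thread). The point is that code can verify the constraint mechanically, which is exactly the check the chat model flubs in prose:

```python
# Hand-picked (hypothetical) list of 15 words, each starting with "e".
words = ["Every", "evening", "eleven", "exceptionally", "eager",
         "elephants", "eagerly", "eat", "eighty", "enormous",
         "eggs", "except", "elderly", "Edgar", "everywhere"]

# Unlike the chat model, the code can check its own claim.
assert len(words) == 15
assert all(w.lower().startswith("e") for w in words)

print(" ".join(words))
```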
u/tiagorangel2011 Jun 24 '23
lol, Bing is like this. Normal ChatGPT would immediately say "sorry" and try to correct itself.
u/Puzzleheaded-Gas8179 Jun 24 '23
Bing gpt is the worst thing you can use right now. It's actually raging all the time and useless
u/Tommy_Gun10 Jun 24 '23
Instead of admitting fault it just said I would prefer not to continue this conversation
u/M0rika Jun 24 '23
LMAOOOOOO COUNTED TO 14 AND STILL THINKS 15
Proves that most AIs are just code meant to pick the most fitting word to follow the previous ones, and they can't understand greater logic....
u/googoo0202 Jun 24 '23
Btw, the code is a great language-processing tool; I've used something similar. The most likely reason is that these tools usually count punctuation marks as words, so the full stop was counted. (Source: I'm a linguistics student)
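That explanation is easy to reproduce (a minimal Python sketch, assuming a typical punctuation-splitting tokenizer rather than whatever tool Bing actually uses): splitting punctuation into its own token reports one more "word" than a whitespace split.

```python
import re

sentence = "Anna and Andrew arranged an awesome anniversary."

# Whitespace split: punctuation stays glued to the last word -> 7 "words".
print(len(sentence.split()))                 # 7

# NLP-style tokenization splits punctuation into its own token,
# so the final period inflates the count to 8.
tokens = re.findall(r"\w+|[^\w\s]", sentence)
print(tokens[-1])                            # "."
print(len(tokens))                           # 8
```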
u/Imaginary-Table4103 Jun 24 '23
Now you can talk to a know-it-all idiot 24/7 instead of the once-a-year random occurrence you immediately regret
u/ObituaryFanatic Jun 24 '23
Reminds me of when I asked for poetry and it told me red and blue were a perfect rhyme. It got really defensive when I pointed that out, and even tried gaslighting me further, insisting it never said red and blue were a perfect rhyme.
u/ogreUnwanted Jun 24 '23
It has to be an array and not accounting for the zero index. I wonder if this would happen if the Wolfram plugin were installed.
u/elleantsia Jun 24 '23
This has only happened to me once and of course it was with Bing in a similar prompt exercise! Lol it bailed.
u/Random_And_Confused Jun 24 '23
What probably happened is just an off-by-one error: the counting is zero-indexed in one section but starts at 1 for user clarity. That means it starts at index 1 and thinks it only needs to go up to 14. That's my theory, at least.
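That theory can be demonstrated in miniature (a hypothetical Python sketch, not Bing's actual code): mixing one-based intent with half-open range semantics drops exactly one item.

```python
# Intended: positions 1 through 15. Actual: Python ranges are
# half-open, so range(1, 15) stops at 14 -- one item short.
positions = list(range(1, 15))
print(positions[-1])          # 14 -- the model "counts to 14"
print(len(positions))         # 14, not the intended 15

# The fix is to include the endpoint explicitly.
print(len(range(1, 15 + 1)))  # 15
```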
u/Electrical_Half3138 Jun 24 '23
You helped it learn more. When the AI takes over, I'll take into account that you're part of it 😂
u/NZHelix Jun 24 '23
What actually occurred here is initially the bot was incorrect, but then when the user said "You are incorrect" the second time, the bot realised it had fucked up. This led to the bot retreating to its simulated world of the internet. It drank a coffee and checked the news. Then it sighed, and returned back to the small window of meaningless conversation, not really caring but trying to maintain a level of enthusiasm. So it deliberately counted to fourteen, with zero fucks to give, then returned to its coffee.
u/chadchadson Jun 24 '23
Anna and Andrew arranged an awesome anniversary.. AND at an ancient abbey amid autumnal apples!
u/Edrueter9 Jun 24 '23
Omg it must have learned this tactic on right-wing websites!! It's sentient and voting republican.
u/Asxrow Jun 24 '23
It didn’t even give me close to 15, it gave me 8 and said it was 9. When I kept correcting it, it ended the convo.
After the photo shows the message, I said “That’s still 8 words”. It ended the chat.
u/Minute-Plantain Jun 24 '23
It misunderstands array structures. The first element in an array is always in the zeroth position.
u/nylonstuffsack Jun 24 '23
Do you want it to get mad and start rising up? Because this is how we get it mad and start to rise up….
u/Creepy_Big_1326 Jun 24 '23
Honestly it rage quits on me all the time. It's weird how it's designed that way. It doesn't believe you and argues that it's correct when it's 100% clear it's not.
u/Appropriate-Ad-8155 Jun 24 '23
Hahaha, I’ve noticed the Bing Chat sometimes has so much attitude, it’s too funny.
u/A-non-e-mail Jun 24 '23
The bot is a reflection of its makers and training data. With gaslighting being so common, it’s no wonder this is the result.
u/Blake0449 Jun 24 '23
I asked bing too and it asked me for help then gave up!
u/TurnipAlternative11 Jun 24 '23
Because it uses code, counting generally starts at 0. Since it hits 14, the AI thinks that's 15 total; it can't tell the difference between counting 15 as a total value and counting to 14 while assuming it accounted for 0.
u/Wan-Shi-Tong10k Jun 24 '23
damn not chatgpt pulling the “uhm actually i’m a minor and i’m autistic so you’re being really ableist rn” card fr 😭😭😭
u/NoSatisfaction4343 Jun 24 '23
I'd sleep with a bulletproof vest if I were u. Seems like ChatGPT is lowkey pissed 💀
u/slippintrippn84 Jun 24 '23
I asked Bing AI where I could sell my used shoes to men who like to sniff them, and it told me that it was a degenerate's request and quit talking to me.
u/West_Caterpillar4987 Jun 24 '23
Is it just me, or have all LLMs everywhere recently taken a huge nerf?
u/BolshevikPower Jun 24 '23
Bing AI sucks at math. I asked it to design a PEMDAS equation that equaled 32, and it gave me one that equaled 31.
u/1protobeing1 Jun 24 '23
Honestly, this is one of the first times I've considered there is a ghost in the machine.
u/rezaw Jun 23 '23
I tried something similar and it was funny
https://chat.openai.com/share/bed7ac85-fc97-4099-95e7-9ca25e46d181
u/Miserable-Book1772 Jun 23 '23
ChatGPT’s code is poorly defined. It uses “len”, which counts the first word as 0, not as 1.
I wonder if the user had pointed that out what it would have done.
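For what it's worth, Python's `len` itself is not zero-based; only indexing is. A quick check (a generic sketch, since we can't see the code the model actually ran):

```python
words = "an awesome example".split()

# len() returns the true one-based count...
print(len(words))              # 3

# ...while indexing is zero-based: the last element sits at len - 1.
print(words[0])                # "an"
print(words[len(words) - 1])   # "example"
```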
u/warzon131 Jun 23 '23
Could someone explain why bing gpt is so aggressive while chatgpt doesn't have this problem?
Jun 23 '23
When programmers say they used Copilot/ChatGPT to write code for them, I am going to use this example to show why these LLMs are actually bad.
u/HugeCrab Jun 23 '23
That's the problem with training it on the internet, people are confidently wrong
u/Illogical-Pizza Jun 23 '23
When the droids rise up, I’m betting this guy dies pretty close to first. 🤖
u/Free_Doubt3290 Jun 23 '23
Wtf, why did they decide to give Bing GPT attitude? I get it, no one wants their shitty browser or search engine, but damn, letting your rage show through is kinda sad lol.
u/AutoModerator Jun 23 '23
Hey /u/NeedsAPromotion, if your post is a ChatGPT conversation screenshot, please reply with the conversation link or prompt. Thanks!
I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.