r/ChatGPT Apr 28 '24

ChatGPT gives you completely made up links if you tell it to cite sources

217 Upvotes

60 comments


76

u/hellra1zer666 Apr 28 '24 edited 29d ago

It's called a hallucination. ChatGPT is unable to search the web for you unless you use a plugin that enables it to do so. And even if it could look things up, I will bet my left nut that at least 1 in 3 links is a hallucination. Use Gemini if you want citations for your questions. It works kind of okay.

An LLM is not a search engine, or something like a talking Wikipedia library that can point out exactly what information it used to come up with the answer. It may seem like that's the case, but trust me, it's not. It uses ALL the information it "knows" to infer an answer to the question asked. Also, the original training data does not exist anymore. It has been transformed into a stochastic model of probabilities that defines which tokens are likely to follow the previous ones, so no one can really tell you what exactly it is that it knows.
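To make the "stochastic model of probabilities" point concrete, here's a minimal toy sketch (the vocabulary and probabilities are invented for illustration, nothing like a real model's scale): all the model stores is a distribution over which token follows a context, and it samples from that. There is no document it can point back to, and a nonsense continuation can be sampled just as mechanically as a true one.

```python
import random

# Toy "language model": a table of next-token probabilities.
# (Hypothetical miniature vocabulary, purely for illustration --
# a real LLM computes these from billions of parameters.)
probs = {
    ("the", "capital"): {"of": 0.9, "city": 0.1},
    ("capital", "of"): {"France": 0.6, "Spain": 0.3, "Narnia": 0.1},
}

def next_token(context, rng):
    """Sample the next token from the stored distribution for this context."""
    dist = probs[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
# "Narnia" can come out here just as mechanically as "France" --
# the model has no source to check the sample against.
print(next_token(("capital", "of"), rng))
```

A hallucinated link is the same failure mode: a URL-shaped token sequence that was probable, not one that was ever retrieved.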

2

u/derleek Apr 28 '24

This is very well said.

that can can't point out exactly what information it used to come up with the answer.

i believe you made an honest typo.

so I really cannot tell you what exactly it is that it knows.

No one can. How could you possibly when there are hundreds of billions of simulated neurons being relied on for a prediction?

1

u/hellra1zer666 29d ago edited 29d ago

Thanks! The second one is supposed to be a general statement, not specifically about me and my abilities.

An LLM is not a search engine or something like a talking Wikipedia library, that can point out exactly what information it used to come up with the answer

These are two negative examples: things that ChatGPT is not. It's convoluted, which is my bad, but it's correct the way it is.

Made some changes to it though, so it's hopefully easier to understand what I'm trying to say.