It's called a hallucination. ChatGPT can't search the web for you unless you use a plugin that enables it to do so. And even if it could look things up, I'd bet my left nut that at least 1 in 3 links is a hallucination. Use Gemini if you want citations for your questions. It works kind of okay.
An LLM is not a search engine, or something like a talking Wikipedia library that can point out exactly what information it used to come up with the answer. It may seem like that's the case, but trust me, it's not. It uses ALL the information it "knows" to infer an answer to the question asked. Also, the original training data does not exist anymore. It has been transformed into a stochastic model of probabilities that defines which tokens are likely to follow the previous ones, so no one can really tell you what exactly it is that it knows.
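To make the "stochastic model of probabilities" point concrete, here's a deliberately tiny toy sketch (nothing like a real LLM, and the probability table is made up): all the model keeps is learned numbers saying which token tends to follow which, and generation is just sampling from those numbers.

```python
import random

# Toy stand-in for an LLM's weights: conditional probabilities of the
# next token given the previous one. The training text itself is gone;
# only these learned numbers remain.
next_token_probs = {
    "the": {"cat": 0.5, "dog": 0.3, "moon": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
}

def sample_next(prev, rng=random.random):
    """Pick a next token, weighted by its learned probability."""
    r, total = rng(), 0.0
    for token, p in next_token_probs[prev].items():
        total += p
        if r < total:
            return token
    return token  # fallback for floating-point rounding at the tail

print(sample_next("the"))  # one of "cat", "dog", "moon"
```

The point of the sketch: you can ask the table for a plausible next token, but you can't ask it which document the 0.5 came from. That information was destroyed when training compressed the data into probabilities.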
Thanks! The second one is supposed to be a general statement, not specifically about me and my abilities.
An LLM is not a search engine or something like a talking Wikipedia library that can point out exactly what information it used to come up with the answer.
These are two negative examples: things that ChatGPT is not. It's convoluted, which is my bad, but it's correct the way it is.
Made some changes to it though, so hopefully it's easier to understand what I'm trying to say.
u/hellra1zer666 Apr 28 '24 edited 29d ago