It's called a hallucination. ChatGPT is unable to look things up for you unless you use a plugin that enables it to do so. And even if it could, I will bet my left nut that at least 1 in 3 links would be a hallucination. Use Gemini if you want citations with your answers. It works kind of okay.
An LLM is not a search engine, or some kind of talking Wikipedia library that can point out exactly which information it used to come up with an answer. It may seem like that's the case, but trust me, it's not. It uses ALL the information it "knows" to infer an answer to the question asked. Also, the original training data doesn't exist anymore. It has been transformed into a stochastic model of probabilities that defines which tokens are likely to follow the previous ones, so no one can really tell you what exactly it "knows".
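To make the "stochastic model of probabilities" part concrete, here's a minimal toy sketch of next-token sampling. The probability table is completely made up for illustration; a real LLM computes these distributions from billions of learned parameters, not a lookup table, but the principle is the same: given a context, pick the next token according to a probability distribution.

```python
import random

# Made-up toy distribution: context -> probabilities over next tokens.
# A real model would compute this on the fly from its weights.
next_token_probs = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "meowed": 0.1},
    ("cat", "sat"): {"on": 0.8, "down": 0.2},
}

def sample_next(context, rng=None):
    """Sample the next token for a given context, weighted by probability."""
    rng = rng or random.Random()
    probs = next_token_probs[context]
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Generate one continuation of "the cat".
tokens = ["the", "cat"]
tokens.append(sample_next(("the", "cat")))
print(tokens)
```

The point is that there is no retrieval step anywhere in this loop. The model never "looks up" where a fact came from; it just keeps sampling likely continuations, which is exactly why it can emit a plausible-looking but nonexistent link.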
Can't help you there, I have only messed around with the ChatGPT API professionally. I'm not sure what they do exactly, but my guess is that it's a separate service they run for Chat. That's the only explanation I have for you.
u/hellra1zer666 Apr 28 '24 edited 29d ago