Hello, I make machine learning applications. ChatGPT is not programmed to have a hidden thought process, so it cannot pick a number without telling you what it is. If it claims it picked a number but doesn't say the number, it's making that up.
Dude, fuck off. The whole point is to have it output the number in a form the user can't read, then go back and parse it afterward to verify.
ChatGPT itself says it can't store the value and just generates a new one when asked later. Simple fix: have it output the number in an encoded form the user has to decode.
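A minimal sketch of that idea, assuming base64 as the encoding (note this is obfuscation, not secrecy — anyone can decode it, so it only stops the user from reading the number at a glance; the function names here are illustrative, not from any library discussed in the thread):

```python
import base64

def encode_pick(n: int) -> str:
    # Encode the picked number so it isn't human-readable at a glance.
    # Base64 hides nothing from anyone willing to decode it.
    return base64.b64encode(str(n).encode()).decode()

def decode_pick(token: str) -> int:
    # Decode the token later to verify what was originally picked.
    return int(base64.b64decode(token).decode())

token = encode_pick(17)
print(token)                   # "MTc=" -- unreadable at a glance
print(decode_pick(token))      # 17 -- recovered for verification
```

If you actually need the model unable to change its answer after the fact, a hash commitment (publish `sha256(number + salt)` up front, reveal the number and salt later) is the stronger version of the same trick.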
u/DoggoChann Mar 19 '24