cm0002@lemmy.world to Technology@lemmy.zip · English · 11 months ago
ChatGPT's hallucination problem is getting worse according to OpenAI's own tests and nobody understands why (www.pcgamer.com)
cross-posted to: technology@lemmy.ml
Optional@lemmy.world · 11 months ago
*raises hand* Because it never “understood” what any “word” ever “meant” anyway?

geekwithsoul@lemm.ee · 11 months ago
Yeah, it’s all hallucinations - it’s just that sometimes the hallucinations manage to approximate correctness, and it can’t tell one from the other.