misk to Technology@lemmy.world · English · 7 months ago
We have to stop ignoring AI’s hallucination problem (www.theverge.com) · 197 comments
Cyberflunk@lemmy.world · 7 months ago
Wtf are you even talking about.
UnsavoryMollusk@lemmy.world · edited · 7 months ago
They are right though. LLMs at their core are just about determining what is statistically the most probable thing to spit out.
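The comment above describes next-token prediction: the model picks whichever continuation its learned probabilities rank highest. A minimal sketch of that idea, using a made-up toy bigram table (not any real LLM) and greedy decoding:

```python
# Hypothetical toy "model": a hand-written table of next-word
# probabilities, standing in for what an LLM learns from data.
probs = {
    "the": {"cat": 0.6, "dog": 0.3, "sky": 0.1},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "up": 0.1},
}

def next_token(prev: str) -> str:
    # Greedy decoding: always emit the statistically most probable
    # continuation of the previous token.
    return max(probs[prev], key=probs[prev].get)

out = ["the"]
for _ in range(3):
    out.append(next_token(out[-1]))
print(" ".join(out))  # the cat sat down
```

Note the sketch never checks whether "the cat sat down" is true of the world; it only chains high-probability continuations, which is the gap the hallucination discussion is about. (Real systems usually sample from the distribution rather than always taking the argmax.)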
Cyberflunk@lemmy.world · 7 months ago
Your one sentence makes more sense than the slop above.