Nemeski@lemm.ee to Technology@lemmy.world · English · 6 months ago
Tim Cook is “not 100 percent” sure Apple can stop AI hallucinations (www.theverge.com)
146 comments · cross-posted to: aicompanions@lemmy.world
kaffiene@lemmy.world · English · 6 months ago
I’m 100% sure he can’t. Or at least, not from LLMs specifically. I’m not an expert, so feel free to ignore my opinion, but from what I’ve read, “hallucinations” are a feature of the way LLMs work.
rottingleaf@lemmy.zip · English · 6 months ago
One can have an expert system assisted by ML for classification. But that’s not an LLM. A sketch of what that architecture looks like is below.
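A minimal sketch of the idea in that last comment, with assumptions: an ML model only classifies the user’s query into an intent, and a hand-written expert system (here just a rule table) supplies the answer. All names, the toy data, and the intent labels are illustrative, not from any real product; the point is that the final text comes from curated rules rather than free-form generation, so it can be wrong about which rule to pick but cannot invent facts the way an LLM can.

```python
# Hypothetical sketch: ML does classification, an expert system answers.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data for the classification step (illustrative only).
queries = [
    "what time does the store open",
    "when are you open on weekends",
    "how do I return a broken item",
    "what is your refund policy",
]
intents = ["hours", "hours", "returns", "returns"]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(queries, intents)

# The expert-system part: fixed, human-authored answers, one per intent.
RULES = {
    "hours": "We are open 9am-6pm, Monday through Saturday.",
    "returns": "Items can be returned within 30 days with a receipt.",
}

def answer(query: str) -> str:
    intent = classifier.predict([query])[0]  # ML handles classification only
    return RULES[intent]                     # expert system supplies vetted text

print(answer("are you open on Sunday?"))
```

The trade-off, as the thread implies, is coverage: such a system only answers questions its rule base anticipates, which is exactly what keeps it from hallucinating.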