• crystalmerchant@lemmy.world · 25 points · 6 months ago

    Of course they can’t. Any product or feature is only as good as the data underneath it. Training data comes from the internet, and the internet is full of humans. Humans make and write weird shit, so the data the LLM ingests is weird, and that weirdness produces hallucinations.