• assassin_aragorn@lemmy.world
    10 months ago

    I think LLMs will be fundamentally unable to do certain things because of how they function. They’re only as good as their training data, and given how fast and loose companies have been with sourcing it, they’ll have learned patterns from incorrect information.

    Fundamentally, an LLM doesn’t know what it’s generating. If I ask it for a citation, it’ll likely give me one that looks legitimate but doesn’t actually exist.