• MentalEdge
    6 months ago

    There’s also the fact that they can’t tell reality apart from fiction in general, because they don’t understand anything in the first place.

    LLMs have no way of differentiating fantasy RPG elements from IRL things. So they can suddenly lose the plot of what is being discussed, for seemingly no reason.

    LLMs don’t just “learn” facts from their training data. They learn how to pretend to be thinking; they can mimic, but not really comprehend. If facts were in the training data, they can regurgitate them, but they don’t actually know which facts apply to which subjects, or when not to make some up.

    • Buffalox@lemmy.world
      6 months ago

      “They learn how to pretend”

      True, and they are so darn good at it that it can be somewhat confusing at times.
      But the current AIs are not the ones we read about in sci-fi.