• KeenFlame@feddit.nu · ↑4 ↓6 · 7 months ago

    The machine learning models and developments of these last few years, called "AI" for some reason, are as big as, if not bigger than, the IT and internet revolution, and have applications across a broader spectrum than anyone can currently imagine.

    • soggy_kitty · ↑8 ↓1 · 7 months ago

      This theory is not nearly rare enough to match the image

        • soggy_kitty · ↑2 ↓1 · 7 months ago (edited)

          Don't get it confused: downvotes are not a measure of hatred. People upvote and downvote depending on whether they agree, and even that doesn't represent the "correctness" of a comment.

          Honestly, try to ignore vote scores; people who obsess over them and chase upvotes are sheep repeating popular phrases for internet points.

    • bipmi@beehaw.org · ↑7 ↓1 · 7 months ago

      That is such a cold take. People say this all the time. I've literally seen and heard people compare this to the industrial revolution before.

      • KeenFlame@feddit.nu · ↑1 · 7 months ago (edited)

        It's idiotic to downvote AND hate my opinion in a place where we're supposed to post hot takes, and then claim it's not a hot take. And still nobody agrees.

    • BudgetBandit@sh.itjust.works · ↑2 ↓1 · 7 months ago

      IMHO, as long as no new random "neurons" form, it's not AI as in Artificial Intelligence, just "a lot of ifs".

      • 31337@sh.itjust.works · ↑1 · 7 months ago

        I think the human brain works kind of the opposite of that. Babies are born with a shitload of neural connections, and the number of connections decreases over a person's lifetime. ANNs typically do something similar while training: many connection weights get pushed toward zero, so they end up having little or no effect.
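
        A minimal toy sketch of that pruning effect (my own illustration, not from the comment; plain numpy on made-up data, and L1 regularization as one common mechanism that pushes most weights toward zero):

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        # Toy regression: only 3 of 20 input features actually matter.
        X = rng.normal(size=(200, 20))
        true_w = np.zeros(20)
        true_w[:3] = [2.0, -1.5, 1.0]
        y = X @ true_w + 0.1 * rng.normal(size=200)

        w = rng.normal(size=20)
        lr, lam = 0.01, 0.5  # learning rate and L1 strength (assumed values)

        for _ in range(2000):
            grad = X.T @ (X @ w - y) / len(y)  # gradient of the squared loss
            grad += lam * np.sign(w)           # L1 subgradient pulls weights to 0
            w -= lr * grad

        # Most of the 17 irrelevant weights end up pinned near zero.
        print("weights near zero:", int(np.sum(np.abs(w) < 0.05)), "of", w.size)
        ```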

        But yeah, these LLMs are typically trained once and frozen during use. "Online learning" is a training regime where the model keeps learning from new data during deployment, but current online methods typically lead to worse models: ANNs "forget" old things they've "learned" when learning new things (catastrophic forgetting).
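
        A hypothetical sketch of that failure mode, using scikit-learn's partial_fit as a stand-in for online learning (the split into "tasks" is my own toy setup): after the model trains on a second task, its accuracy on the first one collapses.

        ```python
        from sklearn.datasets import load_digits
        from sklearn.linear_model import SGDClassifier
        from sklearn.model_selection import train_test_split

        X, y = load_digits(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # "Task A": digits 0-4; "task B": digits 5-9, learned afterwards.
        a_tr, b_tr, a_te = y_tr < 5, y_tr >= 5, y_te < 5

        clf = SGDClassifier(random_state=0)
        classes = sorted(set(y))  # partial_fit needs every class up front

        for _ in range(20):  # several passes over task A only
            clf.partial_fit(X_tr[a_tr], y_tr[a_tr], classes=classes)
        print("task A accuracy after A:", clf.score(X_te[a_te], y_te[a_te]))

        for _ in range(20):  # then several passes over task B only
            clf.partial_fit(X_tr[b_tr], y_tr[b_tr])
        print("task A accuracy after B:", clf.score(X_te[a_te], y_te[a_te]))
        ```

        The drop happens because nothing in plain SGD protects the weights that mattered for task A while the model fits task B.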