• WhatAmLemmy@lemmy.world
    9 months ago

    Give Moore’s law several more cycles, and maybe we’ll have enough computing power to make drop-in replacement humans.

    There seems to be a misunderstanding of how LLMs and statistical modelling work. Neither can solve its accuracy problem, because both operate on a probability distribution and only find correlations in the data. LLMs generate that probability distribution internally, without supervision (a “black box”). They’re only as “smart” as the human-generated input data, and will always produce false positives and false negatives. This is unavoidable. There is simply no critical thought or intelligence whatsoever — only mimicry.
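
    To make the point concrete, here’s a minimal sketch (not a real LLM, just a toy bigram model I made up for illustration): the model “learns” nothing but a probability distribution from counting the training text, and generation is just sampling from it — statistics, not thought.

    ```python
    import random
    from collections import Counter, defaultdict

    # Toy training corpus; the "model" is just bigram counts.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def next_word(prev):
        # Sample the next word from the learned distribution --
        # pure correlation, no understanding of cats or mats.
        options = counts[prev]
        words, weights = zip(*options.items())
        return random.choices(words, weights=weights)[0]

    # After "the", the model saw cat twice, mat once, fish once,
    # so it outputs them with probability 2/4, 1/4, 1/4.
    print(next_word("the"))  # one of: cat, mat, fish
    ```

    Scale the counting up to trillions of tokens and replace the lookup table with a neural network, and you have the same basic picture: the output can only ever reflect the distribution of the human-generated input.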

    I’m not saying LLMs won’t shake up employment, find their niche, and make many jobs redundant, or that critical general-AI advances won’t occur — just that LLMs can’t replace human decision making or control, and attempting that is a disaster waiting to happen. The best they can do is speed up certain tasks, but a human will always be needed to determine whether the results make (real-world) sense.