• daniskarma@lemmy.dbzer0.com · 24 hours ago (edited)

    Read the other replies I gave on this same subject. I don’t want to repeat myself.

    But words DO define thoughts, and I gave several examples, some of them with kids. Precisely in kids you can see how language precedes actual thought. I will repeat myself a little here, but you can clearly see how kids repeat a lot of phrases they just don’t understand, simply because their beautiful plastic brains heard the same phrase in the same context.

    Dogs and cats are not proven to be conscious in the way a human being is, precisely due to the lack of an articulate language. Or maybe not just language but articulated thought. I think there may be a trend to humanize animals, mostly to give them more rights (even though I think a dog doesn’t need an intelligent consciousness for it to be bad to hit a dog), but I’m highly doubtful that dogs can develop a chain of thoughts that affects itself without external inputs, which seems like a pretty important part of the experience of consciousness.

    The article you link is highly irrelevant (did you read it? Because I am also accusing you of not having read it, of it just being the result of a quick Google to try to support your point with an appeal to authority). The fact that spoken words are created by the brain (duh! Obviously; I don’t even know why how the brain creates an articulated spoken word is even relevant here) does not imply that the brain does not also take form due to the words that it learns.

    To give an easier-to-understand example: for a classical printing press to print books, the type for the words of those books had to be loaded into the press beforehand, and the press can only print the letters that have been loaded into it.
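
    To make the analogy a bit more concrete, here is a toy sketch in Python (the names are mine and purely illustrative) of a “press” that can only produce output assembled from the type loaded into it beforehand:

    ```python
    # Toy model of the printing-press analogy: the press can only reproduce
    # characters whose type was physically loaded into it beforehand.
    class PrintingPress:
        def __init__(self, loaded_type):
            # The set of letters available in the press.
            self.loaded_type = set(loaded_type)

        def print_page(self, text):
            # Any character not loaded into the press simply cannot appear.
            missing = set(text) - self.loaded_type - {" "}
            if missing:
                raise ValueError(f"missing type: {sorted(missing)}")
            return text

    press = PrintingPress("abcdefghilmnoprstu")
    print(press.print_page("the press prints this"))  # works: every letter was loaded
    # press.print_page("xylophone") would raise: 'x' and 'y' were never loaded
    ```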

    Not only had the user I replied to read the article, they also kindly summarized it for me. I will still read it. But its arguments on why current LLM architectures cannot create consciousness are actually pretty good, and they have put me on the way to being convinced of that, at least within the limitations the article describes.

    • Ageroth@reddthat.com · 24 hours ago

      Your analogy to mechanical systems is exactly where the comparison with the human brain breaks down: our brains are not like that. We don’t just have blocks of text loaded into us; sure, we only learn what we get exposed to, but that doesn’t mean we can’t think of things we haven’t learned about.
      The article I linked talks about the separation between the formation of thoughts and those thoughts being translated into words by our capacity for language.

      The fact that you “don’t even know why how the brain creates an articulated spoken word is even relevant here” speaks volumes about how much you understand the human brain, particularly in the context of whether artificial intelligence actually understands the words it generates: whether there are thoughts behind the words, rather than just guessing which word comes next based on other words whose meanings are irrelevant.

      I can listen to a song long enough to learn the words; that doesn’t mean I know what the song is about.

      • daniskarma@lemmy.dbzer0.com · 23 hours ago

        but that doesn’t mean we can’t think of things we haven’t learned about.

        Can you think of a colour you have never seen? Could you imagine the colour green if you had never seen it?

        The creative process is more modification than creation: taking some inputs, mixing them with other inputs, and producing an output that contains parts of all those inputs. Does that sound familiar? But without those inputs it seems impossible to create an output.

        And thus the importance of language in an actual intelligent consciousness. Without language the brain could only do direct modifications of the natural inputs, of external inputs. But with language the brain can take an external input, then transform it into a “language output” and immediately take that “language output” and read it as an input, process it, and go on. I think that’s the core concept that makes humans different from any other species, this middle thing that we can use to dialogue with ourselves and push our minds further. Not every human may have a constant inner monologue, but every human is capable to talking to themself, and will probably do when making a decision. Without language (language could take many forms, not just spoken language, but the more complex feels like it would be the better) I don’t know how this self influence process could take place.