• doughless@lemmy.world

    A comment on the YouTube video makes a good point that we already have a better word for the concept of dealing with multiple things at once: multitasking. Using a word that literally means “things happening at the same time” just adds to the confusion, since people already have a difficult time understanding the distinction between multitasking and concurrency.

    • FizzyOrange@programming.dev

      Yeah, it always bothered me that they’re effectively saying “concurrency is not concurrency”.

      I’m going to start using “multitasking” instead. That’s so much better. Who’s with me?

      • doughless@lemmy.world

        I will typically use the terms asynchronous and parallel when discussing the concepts, but I hadn’t thought about using multitasking until I saw that comment. I mean, even C# calls them “tasks”.

    • lysdexic@programming.devOP

      > A comment on the YouTube video makes a good point that we already have a better word for the concept of dealing with multiple things at once: multitasking.

      I don’t think that’s a good comment at all. It ignores fundamental traits that separate the two concepts. Multitasking refers to single-threaded task switching, whereas concurrency has a much broader meaning: it covers multithreaded and multiprocess execution of many tasks that may or may not yield, and that may be assigned to different cores, processors, or even nodes.

      In other words, concurrency goes well beyond “doing many things at once”: it subsumes both parallelism and asynchronous programming.
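The narrower, interleaved case can be sketched in a few lines of Python (a hypothetical example, not from the video or the thread): two coroutines make progress during the same period of time on a single thread, which is concurrent execution even though nothing runs at the same physical instant.

```python
import asyncio

results = []

async def task(name):
    # Each await is a yield point: the single-threaded event loop
    # switches to the other coroutine, interleaving the two tasks.
    for i in range(2):
        results.append(f"{name}:{i}")
        await asyncio.sleep(0)

async def main():
    # Both coroutines are "in flight" at the same time (concurrency)
    # on one thread (no parallelism).
    await asyncio.gather(task("a"), task("b"))

asyncio.run(main())
print(results)  # typically interleaved: ['a:0', 'b:0', 'a:1', 'b:1']
```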

    • FrostyPolicy@suppo.fi

      A CPU core can only do one thing at a time. When you have multiple cores you can do multiple things at the same time. “Multitasking” in the programming sense is a bad term; it’s a term more for the masses.

      A bit simplified:

      • concurrency: you seem to be doing multiple things at the same time. In reality they run little by little, one after another. It doesn’t really speed things up.
      • parallelism: you actually run multiple things at the same time (multiple CPUs/CPU cores required). If the code scales properly, or is designed to truly run in parallel, the speedup is proportional to the number of CPUs available.

      Edit: It’s a much more complex subject than I’ve presented here.

      • FizzyOrange@programming.dev

        You missed the point. He understands all these things you tried to explain. The point is that your definition of the word “concurrency” is objectively wrong.

        You:

        > you seem to be doing multiple things at the same time. In reality they are run little by little one after another

        The actual meaning of the word “concurrency”:

        > The property or an instance of being concurrent; something that happens at the same time as something else.

        Wiktionary even disagrees with your pedantic definition in computing!

        > (computer science, by extension) A property of systems where several processes execute at the same time.

        I suspect that concurrency and parallelism were actually used interchangeably until multicore became common, and then someone noticed the distinction (which is usually irrelevant) and said “aha! I’m going to decide that the words have this precise meaning” and nerds love pedantic "ackshewally"s so it became popular.

        • nous@programming.dev

          I suspect “concurrency” came into use back when CPUs were single-threaded and process scheduling became a thing, to describe the difference between running one process after another and interleaving processes so they appear concurrent. Then, once truly multithreaded programs became possible, a new word was needed for things happening at the exact same time instead of only appearing to.

        • FrostyPolicy@suppo.fi

          Wikipedia puts it nicely:

          “The concept of concurrent computing is frequently confused with the related but distinct concept of parallel computing,[3][4] although both can be described as ‘multiple processes executing during the same period of time’. In parallel computing, execution occurs at the same physical instant: for example, on separate processors of a multi-processor machine, with the goal of speeding up computations—parallel computing is impossible on a (one-core) single processor, as only one computation can occur at any instant (during any single clock cycle).[a] By contrast, concurrent computing consists of process lifetimes overlapping, but execution does not happen at the same instant.”

          • FizzyOrange@programming.dev

            You’re still missing the point. We all understand that definition. We’re just saying that it is incorrect use of the word “concurrent”. Does that make sense? The word “concurrent” means things happening at the same time. It’s stupid for programmers to redefine it to mean things not happening at the same time.

      • BB_C@programming.dev

        With hyper-threading and preemption in mind, maybe it’s concurrency all the way down 😎 . But we should definitely keep this on the down low. Don’t want the pesky masses getting a whiff of this.

  • platypus_plumba@lemmy.world

    Do we really need a video about this in 2024? Shouldn’t this be already a core part of our education as software engineers?

    • lysdexic@programming.devOP

      > Do we really need a video about this in 2024? Shouldn’t this be already a core part of our education as software engineers?

      I’m not sure what point you tried to make.

      Even if you believe some concept should be a core part of the education of every software engineer who ever lived, I have yet to meet a single engineer with encyclopedic knowledge of every topic covered in their education. In fact, every engineer I have ever met retained only a small subset of their curriculum.

      So what exactly is your expectation?

      • platypus_plumba@lemmy.world

        My expectation is that this is something programmers should be aware of all the time. Forgetting it is like forgetting what an interface is; it’s at the core of what we do. At least I think so. Maybe I’m wrong in assuming it’s something every programmer should keep in mind.

    • NostraDavid@programming.dev

      Shouldn’t it? Yes, just like the ability to unit test, but that doesn’t stop schools from skipping over it either.