• CanadaPlus@lemmy.sdf.org · 1 year ago

    So was all this bloat inevitable as hardware got better, or is there a way to go back? It feels like a ripoff that our computers are 1000x better but they’re maybe 10x faster once all the shitty software is taken into consideration.

    • mea_rah@lemmy.world · 1 year ago

      Perhaps some bloat is kind of inevitable. For example, apps these days handle most languages just fine, including emoji and LTR/RTL text, and some have pretty decent accessibility support. They can render a pretty complicated interface on an 8K screen reasonably fast (often hardware-accelerated in some way). There is a ton of functionality baked in: your editor can render your HTML or Markdown side by side with the source as you edit it, and you have version control, a terminal emulator, language servers, etc.

      But then there’s Electron, which takes an engine capable of rendering just about anything and uses it purely to render UI, so there isn’t much optimization you can do. A button is actually a bunch of DOM elements styled with CSS, and so on. It’s just good enough for the “hardware is cheap” approach.

      I think Emacs is a good example to look at. It has a ton of built-in functionality, and with enough plugins (either a custom configuration or something like Doom Emacs) you can have a very capable editor, comparable to the likes of VS Code. Decades back Emacs had a reputation for being bloated because it used megabytes of RAM. These days it’s even more “bloated” due to all the stuff that has been added since, but in absolute numbers it doesn’t need anywhere near as many resources as its Electron-based peers. The difference can easily be an order of magnitude or more, depending on configuration.

      • MyFairJulia@lemmy.world · 1 year ago

        I am working on an application using DevExpress XAF. It lets you build a big enterprise application relatively quickly by doing a lot of the dirty work you would otherwise do yourself for CRUD stuff. A lot of the application can be modified through mere clicks, without touching a single line of code.

        It is cool but kinda bloaty. When you simply launch an XAF application, it uses 300 megabytes of RAM. And that’s before you’ve even loaded a single byte of business data. You have just reached the login screen.

        At least I felt it was “kinda bloaty” until I first booted Void Linux on my gaming PC at home and took a look at htop. IT’S ONLY 400 MEGABYTES AND IT’S READY TO USE! MAYBE ADD 200 MEGABYTES FOR KDE!

        ALL THIS BLOATING CANNOT CONTINUE! WE HAVE TO TAKE ACTION IMMEDIATELY OR WE WILL BE FOREVER DOOMED TO UPGRADE OUR RAM IF WE WANT EVEN A FUNCTIONING TEXT EDITOR!

    • ZILtoid1991@kbin.social (OP) · 1 year ago

      I have a few suggestions:

      1. Better education. Don’t scare people who are learning programming away from the lower-level stuff, especially now that people are getting scared even of type declarations, not just of pointers (which I was fearmongered about in college, where they told me Java is the future).
      2. Better portable APIs. Thanks to WebAssembly, one could easily have something that is both portable in a web browser and a native desktop app, yet instead we get browsers running said applications. I did some thinking about such a project, but then I remembered my iota project (a D-native replacement for SDL/SFML/GLFW, without the bloat, by making use of standard library features) and immediately stopped thinking about it, since a much smaller project already causes me too much headache. (Does anyone have a handy guide to the Win32 API? I have trouble getting certain messages produced, like the input language change, and I don’t know whether I overlooked some functions that enable them and just weren’t included in the documentation of the input language change event codes. A sketch of the kind of message handling I mean follows right after this list.)
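
      For what it’s worth, here is a minimal, untested sketch of a plain Win32 window procedure that receives those input-language messages (not from any project mentioned here, just an illustration). The detail that is easy to miss is that WM_INPUTLANGCHANGEREQUEST has to reach DefWindowProc, otherwise the system never sends WM_INPUTLANGCHANGE:

      ```c
      #include <windows.h>

      /* Sketch of a window procedure; register it with RegisterClassEx as usual. */
      LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
      {
          switch (msg)
          {
          case WM_INPUTLANGCHANGEREQUEST:
              /* Posted to the focused window when the user switches layouts.
                 Letting DefWindowProc handle it accepts the switch; if you
                 swallow it here, WM_INPUTLANGCHANGE is never sent. */
              break;

          case WM_INPUTLANGCHANGE:
              /* Sent after the switch is accepted. lParam carries the HKL of
                 the new input locale, wParam the character set. */
              {
                  HKL newLayout = (HKL)lParam;
                  (void)newLayout; /* ...update keyboard handling here... */
              }
              /* Fall through to DefWindowProc so child windows are notified too. */
              break;

          case WM_DESTROY:
              PostQuitMessage(0);
              return 0;
          }
          return DefWindowProc(hwnd, msg, wParam, lParam);
      }
      ```
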
      • CanadaPlus@lemmy.sdf.org · 1 year ago

        You know, I haven’t worked on a super big project, but I feel like every time I’ve gotten a type error in a static language it’s pointed to something wrong with my underlying reasoning.
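
        A toy C illustration of the kind of thing I mean (purely hypothetical, not from any real project): if seconds and milliseconds get distinct wrapper types, mixing them up becomes a compile-time type error that points at the flawed reasoning instead of producing a silently wrong delay.

        ```c
        #include <stdio.h>

        /* Distinct wrapper types so the compiler can tell the units apart. */
        typedef struct { long value; } Seconds;
        typedef struct { long value; } Milliseconds;

        static Milliseconds to_millis(Seconds s)
        {
            return (Milliseconds){ s.value * 1000 };
        }

        static void schedule_retry(Milliseconds delay)
        {
            printf("retrying in %ld ms\n", delay.value);
        }

        int main(void)
        {
            Seconds timeout = { 30 };

            /* schedule_retry(timeout);            <- type error: the reasoning bug
                                                      (seconds where ms expected)   */
            schedule_retry(to_millis(timeout));  /* the fix the error points to */
            return 0;
        }
        ```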

      • ZILtoid1991@kbin.social (OP) · 1 year ago

        In the short run, yes. In the long run, this just produces a bunch of coders who are now afraid of type declarations, because they were scared away from them with the “what if you have to choose?” tagline, which makes turning back to the proper way of doing things even harder.

        • scubbo@lemmy.ml · 1 year ago

          Can you talk more about this? I’ve never heard that tagline and can’t figure out what it’s supposed to mean.

          • CanadaPlus@lemmy.sdf.org · 1 year ago

            Just from context, I’m guessing it means that you might type things one way and then need to use them another way later, and dynamically typed languages are sold as not having that problem.

      • CanadaPlus@lemmy.sdf.org · 1 year ago

        I was thinking about this a bit. Does that mean you can develop a piece of software much more cheaply now? My fear is that companies writing software get a 10% discount for writing bloat, while clients wind up using 10,000% of the resources and are so used to it that they don’t complain.

    • renlok@lemmy.ml · 1 year ago

      It’s not really inevitable; it’s just a consequence of developers being able to get away with being lazy because the hardware can cope with it.