Part of what’s making learning Linux so difficult for me is how fragmented it is. You can install programs with sudo apt install (program). You can get programs with snaps. You can get programs with flatpaks. You can install from tar.gz files. You can install from .deb files. You can get programs with .sh files. There are probably more I don’t know about.

I don’t even know where all these programs are being installed. I haven’t learned how to uninstall them yet. And I’m sure that each way has a different way to uninstall too.

So that brings me to my main question. Why not consolidate all this? Sure, files CAN be installed anywhere if you want, but why not make a folder like /home/programs/ where it’s assumed that programs would be installed?

On Windows, programs can be installed anywhere, but the default is C:/Windows/Program Files x86/ or something like that. Now, you can change it all you want when you install the programs. I could install to C:/Fuckfuckfuck/ if I wanted to. I don’t want to, so I leave it alone, because C:/Windows/Program Files x86/ is where it’s assumed all the files are.

Furthermore, I see no benefit to installing 15 different programs in 7 different folders. I begrudgingly understand why there are so many different installation methods, but I do NOT understand why, as a collective community, we can’t have something like a standardized setting in each distro where you can set one place for all your installation files.

Because of the fragmentation of distros, I can understand why we can’t have a standardized location across all distros like Windows has. However I DON’T see why we can’t have a setting that gets set upon each first boot after installation that tells each future installation which folder to install to.

I would personally pick /Home/Programs/, but maybe you want /root/Jamies Files/ because you’re Jamie, and those are your files.

In either case, as we boot up during the install, it would ask us where we want our program files installed. And from then on, no matter what method of install you chose, it would default to whatever your chosen folder was.

Now, you could still install other places too, but you would need to direct that on a per install basis.

So what’s the benefit of having programs each installed in separate locations that are wildly different?

  • manicdave@feddit.uk · 2 hours ago

    Linux is actually kinda designed to be less fragmented than Windows, really.

    The reason you don’t pick an install directory is because the standard is that binaries live where binaries live, dependencies live where dependencies live, logs live where logs live, etc.

    All the user should worry about is where the media or whatever your program works with is.

    Always try to find the apt install instructions for whatever program you want, and it’s easy to uninstall with apt remove.

    Apart from a few deb packages, almost everything that can’t be managed via apt should be considered incomplete or experimental. If it was ready for you to just use it without issue, it would be in an apt repository.
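    A sketch of that workflow (package name vim is just an example; install/remove need root, so they’re left as comments and only a harmless read-only query actually runs):

```shell
# sudo apt install vim    # install a program plus its dependencies
# sudo apt remove vim     # uninstall it again (config files kept)
# sudo apt purge vim      # uninstall it and delete its config files too
# dpkg -L vim             # list every file the package placed on disk
if command -v dpkg >/dev/null 2>&1; then
  dpkg -l | head -n 8     # first few entries of the installed-package list
else
  echo "not an apt/dpkg-based system"
fi
```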

    It may seem a bit daunting to have to use command line at first, but once you’re used to it, you’ll realise how absolutely broken and archaic managing software on windows is. (Like seriously, it’s 2024 and you’re still having to fish through slow or sketchy websites to find installers for tools and drivers.)

  • Goingdown · 6 hours ago

    First of all, in Linux everyone should only use software from distribution repositories (e.g. via the apt command in Debian, Ubuntu, Mint, the dnf/yum command in Fedora, etc.). Package managers install software in a controlled way, and it is really easy to remove it too. And there is usually a GUI app for installing apps from distribution repositories.

    Second way is to use flatpak / snap. They are pretty much similar and will keep things easy.

    Do not install .sh packages or tar.gz archives unless you really know what you are doing. These are only for expert cases.

    One fundamental change coming from Windows is that in Linux, you should never worry about the location where software is installed (except for those expert cases, which you should not use). Things are always put in the correct places. In Linux, apps are sorted so that executables go to /usr/bin, library files to /usr/lib64 and /usr/lib, other non-modifiable application data to /usr/share, etc. It takes a while to get used to, but in the long term it feels more natural than the Windows way of dumping everything into the app’s own directory.
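    That sorting is easy to see for yourself (exact paths can vary slightly between distros):

```shell
command -v sh             # where an executable resolves to (/usr/bin or /bin)
ls /usr/bin | head -n 5   # a few of the system's installed executables
ls /usr/share | head -n 5 # shared, architecture-independent application data
```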

    My recommendation would be to install some user-friendly distribution (Ubuntu, Fedora, Mint) and just go with the default package management it offers. If you like the Android way of handling software, Fedora Silverblue is kind of like that: system upgrades are handled the same way, and applications are installed as flatpaks.

  • coherent_domain@infosec.pub · 6 hours ago

    My strategy is to always install program with flatpak, SDKs are also installed as flatpak, find graphical alternatives to command line programs. I don’t use command line a lot, so I don’t need fancy tools for it.

    I only have one system package installed for inputting unicode math symbols. So that I have a clean and easily migratable system.

  • Rimu@piefed.social · 13 hours ago

    What you’re seeing is the result of decades of new ways to install stuff being added at different times. As a new way is added all the old ways still need to work because getting everyone to switch to the new way is impossible. There is no central authority making these decisions, it’s more of a marketplace of ideas with different ‘sellers’ competing for attention.

    • coherent_domain@infosec.pub · 7 hours ago

      I might add, every new way actually seeks to “consolidate” all the older ways, and always ends up just adding to the list of ways needing to be consolidated.

  • Jumuta@sh.itjust.works · 9 hours ago

    there’s different ways to install things because they each have their use cases in which they’re better than others (or used to have use cases)

    • binary package managers (e.g. apt): fast and lightweight because it only downloads/installs the necessary binaries

    • flatpak: can be installed on any distro, but takes up more storage space because they’re installed in a sandbox and all the dependencies are also installed with it, for every application

    • snap: same thing as flatpak but a bit worse; some applications are only packaged for snap because Canonical paid a lot of big companies to package for snap (they didn’t incentivise against flatpak, they just didn’t fund flatpak)

    • appimage: the ‘windows exe’ kinda thing and has all the dependencies bundled so distro agnostic, but you have to manage the appimage files yourself unless you get a manager for it and you can’t update them centrally like you can do with other stuff

    • source code repos (e.g. aur): have to compile every new version yourself on your machine, so is slow to update, but often offers things not in the binary package manager

    • .sh files for installation: idk why these are used, they’re just annoying. a lot of proprietary software from corpos use them (probably so they can verify dependencies themselves and not trust the system)

    • binary packages (e.g. .deb): same thing as with appimage except they’re not distro agnostic

    • tar.gz: is just a compressed file format like zip

    • DarkThoughts@fedia.io · 2 hours ago

      .sh files are shell scripts, they’re comparable to Windows batch files or newer powershell scripts. They can be useful for tools with lots of dependencies which they then download on their own, so you often see them when you want to install something like LLM tools from Github or whatever. They’re easy to put together and easy to edit, even for the user itself, unlike a precompiled installer.
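      A toy install.sh in that spirit: check dependencies, then copy files into place. Everything here is hypothetical and self-contained (it “installs” into a temp directory, not your system):

```shell
set -e
PREFIX="$(mktemp -d)"       # a real installer would default to /usr/local

# 1. the dependency check, the part that makes .sh installers attractive
for dep in tar gzip; do
  command -v "$dep" >/dev/null 2>&1 || { echo "missing: $dep"; exit 1; }
done

# 2. "install": create the target tree and drop a (fake) tool into it
mkdir -p "$PREFIX/bin"
printf '#!/bin/sh\necho hello from mytool\n' > "$PREFIX/bin/mytool"
chmod +x "$PREFIX/bin/mytool"

"$PREFIX/bin/mytool"        # prints: hello from mytool
```

      Because it is just text, you can open it in an editor and see exactly what it will do before running it, which is the main courtesy such scripts offer.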

    • Daemon Silverstein@thelemmy.club · 7 hours ago

      .sh files for installation: idk why these are used, they’re just annoying. a lot of proprietary software from corpos use them (probably so they can verify dependencies themselves and not trust the system)

      GOG (Good Old Games) distributes the games using a .sh file containing all the binaries and assets needed for the game. It’s strange to think of, but the binary data coexists with textual shellscript instructions, thanks to the exit instruction (which ensures that the shell won’t try to interpret the binary data) alongside some awk/grep/tail wizardry to extract the binary data from the current shellscript.

      It’s probably because .sh can run in any distro, because every distro has a shell interpreter. Also, they don’t need to be compiled (differently from .appimage, for example), it’s just a merge of a .sh and a binary archive (possibly .tar.gz).
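      The trick is small enough to rebuild at home: shell commands on top, an exit to stop the interpreter, raw archive bytes appended below (a minimal sketch of the technique, not GOG’s actual installer):

```shell
set -e
work="$(mktemp -d)"; cd "$work"

# payload to embed: a tiny "game" directory, tarred and gzipped
mkdir game; echo "game data" > game/data.txt
tar czf payload.tar.gz game

# build the self-extracting script: 4 text lines, then binary payload
{
  printf '#!/bin/sh\n'
  printf 'tail -n +5 "$0" | tar xzf -\n'  # slice out everything after line 4
  printf 'echo extracted\n'
  printf 'exit 0\n'                       # shell never reads the binary part
} > installer.sh
cat payload.tar.gz >> installer.sh
chmod +x installer.sh

rm -rf game                # pretend we are on a fresh machine
sh installer.sh            # prints: extracted
cat game/data.txt          # prints: game data
```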

  • Possibly linux@lemmy.zip · 15 hours ago

    Linux doesn’t really have stand-alone programs like Windows. A package is a series of files that get placed in the proper places, plus some optional scripting. Packages have dependencies, so you can’t just run a binary from a package. The closest thing Linux has is AppImage, but it lost a lot of steam.

    In Linux there are two general types of package managers. The first one is native packages. Native packages install to the root filesystem and are part of the core system.

    The second type of package manager is the portable format like Flatpak. Flatpaks can either be installed system-wide or per user. The big difference is that they run in their own environment and have limited permissions. This is done by creating a sandbox that has its own filesystem so that it is independent of the system. This is also what makes them portable, as that environment is the same no matter what.

    Technically snap packages are portable but you aren’t going to see much use outside of Ubuntu since the underlying architecture has so many flaws.

    • This is sometimes true.

      Go and Rust both (often) build single-executable binaries, often with very few (and, rarely, no) dependencies. It’s becoming more rare for developers to include proper man pages, more’s the pity, but things like man pages, READMEs, and LICENSE files are often the only assets packages from these languages include.

      If you’re installing with Cargo or go install, then even the intermediate build assets are fairly well-contained; go install hides binaries quite effectively from users who don’t know to include GOPATH(/bin) in their paths, because Go puts everything into a single tree rooted there.
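      For the curious, the hiding place is predictable (these are the usual Go defaults, not guaranteed on every setup):

```shell
# go install drops binaries under $GOPATH/bin, defaulting to $HOME/go/bin
GOPATH="${GOPATH:-$HOME/go}"
echo "go install puts binaries in: $GOPATH/bin"
export PATH="$PATH:$GOPATH/bin"   # make them runnable by name
```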

      Libraries are a different matter; you get headers and usually more documentation. Interpreted languages are as you say: a great pile of shit spewed all over your system, and how bad that can be depends a lot on how many different ways you install them.

      Anyway, I’m not disagreeing with you, except that it’s a trend for newer compiled languages to build stand-alone binaries that you can usually just copy between systems and have it work.

    • Lost_My_Mind@lemmy.world (OP) · 14 hours ago

      Now does flatpak get its programs from the same place that the terminal would? I’m still trying to grasp what’s even happening here. Because from my (limited) experience I like flatpaks more than any other method used so far, and am unclear why anyone would use the terminal if given the choice.

      As for snaps, I heard Ubuntu owns the technology behind snaps, and for some reason everybody hates snaps because Canonical owns it. Which I don’t get. As far as I know they don’t abuse snaps, and they don’t cause viruses or anything. So why would it matter who owns the technology behind them?

      • banazir@lemmy.ml · 3 hours ago

        Now does flatpak get its programs from the same place that the terminal would?

        I usually install Flatpaks from the terminal, but as to your question: no, the distro’s package manager and Flatpak have different repositories (servers with software packages) and formats. While distros like Fedora have their own Flatpak repositories, most people use Flathub. You can install apps as Flatpak on any distro that supports them, but native package managers generally don’t support other distros’ repositories.
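        In terminal terms, the separation looks like this (a hypothetical session, guarded so it degrades gracefully on machines without flatpak):

```shell
if command -v flatpak >/dev/null 2>&1; then
  flatpak remotes        # which repositories ("remotes", e.g. flathub) are configured
  flatpak list --app     # apps installed from those remotes
else
  echo "flatpak is not installed on this machine"
fi
```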

        for some reason everybody hates snaps because canonical owns it.

        As I understand it, the Snap server software is proprietary and doesn’t support independent repositories, so you have to install snaps from Canonical. This is not exactly in line with Free (as in Freedom) Software principles. Canonical has made many questionable decisions in the past.

      • PetteriPano@lemmy.world · 13 hours ago

        everybody hates snaps because canonical owns it

        We like things to be open so that we can review or replace them. The snap store is proprietary and controlled by Canonical. I don’t want my data collected and subject to Canonical’s EULA when using my choice of distro.

        Canonical has a history of making bad choices, so the level of trust is not very high. It feels like an attempt at embrace, extend, extinguish: get people hooked on snaps and then make snaps suck on other distros, kind of thing.

      • Possibly linux@lemmy.zip · 13 hours ago

        https://flathub.org/

        The reason most people don’t like snaps is fairly complicated. It started with Ubuntu forcing some basic packages to install as a snap instead of a native package. The thing is snaps are not native packages and because of this it caused major problems. These days a lot of the issues have been addressed but there are still some serious design flaws. The biggest issue is that it is way overly complex and depends on a privileged daemon. The result of this is poor performance and a clunky experience.

  • TheGrandNagus@lemmy.world · 6 hours ago

    This is part of why these days I just stick to flatpaks. No fragmentation, same on any distro, I know where all the programs are going as well as all their config files.

    If I want to back up my flatpaks I can do so trivially.
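    That backup really is a one-liner: dump the installed app IDs to a file and replay it later (a hypothetical session; the flathub remote name is an assumption, and it’s guarded for machines without flatpak):

```shell
if command -v flatpak >/dev/null 2>&1; then
  flatpak list --app --columns=application > my-flatpaks.txt
  echo "$(wc -l < my-flatpaks.txt) app IDs saved"
  # restore on a fresh machine:
  # xargs -n1 flatpak install -y flathub < my-flatpaks.txt
else
  echo "flatpak is not installed on this machine"
fi
```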

    It’s a godsend. Way better than having a bunch of different formats everywhere, or the windows-style some programs installed in XYZ directory, some in program files, some in program files (x86), with config files saved literally anywhere. Maybe it’s in one of the dozens of poorly laid out appdata folders, maybe it’s where the exe is, maybe it’s in documents, maybe it’s in C:, maybe it’s hidden in my user directory, etc. I’ve even seen config files saved to bloody onedrive by default, leading to some funky app behaviour when I wasn’t connected to the internet, or when I ran out of onedrive space.

        • iopq@lemmy.world · 8 hours ago

          No, because docker shows up in random places in your system and takes forever to set up compared to the actual program

          There are a few repos for online management consoles, and the original version used a .sh file that installed in 30 seconds on a single-core free VPS. The docker version took more like two minutes, and when I uninstalled it, I still had traces of docker on the VPS.

  • Solumbran@lemmy.world · 16 hours ago

    The idea is to not manage programs by hand to avoid messes.

    The multiple solutions come from various needs, but it’s more of an underlying complexity of installation than a real intent to have so many.

    Ideally we would need one clean package manager that handles everything without ever having to tinker with installation paths and various methods.

  • Nicht BurningTurtle@feddit.org · 16 hours ago

    I don’t see the issue, since you rarely have to run a program by going to the location of the binary (AppImage and others excluded).

    • Lost_My_Mind@lemmy.world (OP) · 16 hours ago

      I just like to understand things, and I like to be organized. Being organized helps me understand things.

      Another thing I’m not understanding is, if Android is just Linux anyways, why aren’t there PC distros that are just Android?

      I need a resource where my brain can actually ask all the questions. Youtube videos are kind of informative. My issue with them is they’re more along the lines of “here’s how to do this thing that only advanced users will even know the terms being discussed”. Whereas I’m like “Where is the uninstall button for programs?”

      • catloaf@lemm.ee · 16 hours ago

        Usually right next to the install button. Or, if you used the command line, change apt install vim to apt remove vim.

        The best way to learn how to use something is, of course, the manual.

        • Lost_My_Mind@lemmy.world (OP) · 15 hours ago

          Is vim a command? Or are you using it as an example of a program? I’ve only heard of “sudo apt install (program)”

          • Dudewitbow@lemmy.zip · 15 hours ago

            vim is a very common text editor. He’s just using it as an example program to install/remove.

            • Lost_My_Mind@lemmy.world (OP) · 14 hours ago

              I’m on ZorinOS actually, but I’m not sure if it’s permanent. I’m going to be buying a bunch of smaller SSD’s next month. Just trying a crapload of new distros. I haven’t landed on anything yet.

          • catloaf@lemm.ee · 15 hours ago

            I literally just said to read the manual. It will tell you much more than you are asking.

            • Lost_My_Mind@lemmy.world (OP) · 13 hours ago

              I’m still at work, so I’m not near my computer. Plus…I’m not sure which manual you mean. I didn’t mention my distro.

              • lime!@feddit.nu · 8 hours ago

                “the manual” in linux always refers to the man command. run it with the name of a command as an argument and you will get a full description of how that command is used.

        • Lost_My_Mind@lemmy.world (OP) · 14 hours ago

          One of the big problems I see with Linux is the lack of software that people know. The response seems to often be “Well we don’t have THAT, but we have this alternative…”. And the reason for that is the big name software sticks to where they know the userbase is.

          Android on the other hand IS where the userbase is. You’re either on iPhone, or you’re on Android. So there’s a lot of software already available, ready to run. It would work the same on both your phone and PC, since it literally is the same apk. And it will always have support, due to being one of the main ways people use phones.

          The fact that Linux HASN’T found a way to use APKs and Android-based distros baffles me, as it already has a MASSIVE foothold in what people know. There’s so much potential there! Imagine plugging your cell phone into a desktop via a dock, or a usb cable, or even some wireless communication (not bluetooth or wifi), and suddenly your entire PC setup is actually running off your phone. For most people, their cell phone would be a good enough desktop if it had a desktop mode. I connected a keyboard with trackpad to my Samsung A8 android tablet, and don’t feel the need for an actual laptop. I use Win-X launcher to give it a traditional desktop launcher feel, and I’m happy with it.

          • ZoDoneRightNow@kbin.earth · 12 hours ago

            I don’t understand what you are saying. Android is missing a bunch of stuff that linux users rely on for a full desktop experience. They have completely different use cases. Android isn’t designed with desktop in mind, Linux is. A lot of the linux apps I rely on aren’t on android and vice versa.

            • Lost_My_Mind@lemmy.world (OP) · 11 hours ago

              A lot of the linux apps I rely on aren’t on android and vice versa.

              That’s exactly my point. I’m saying since it runs on the same format anyways, why NOT make it run on both? Then when someone uses an app on their phone, you could convince them to use that same app on a desktop, since they know it.

              And then, once it’s established to people that Android and Linux work together, publishers will start designing their Android apps like a hybrid. Eventually phones would just become Linux distros on a phone. And when you get home, you connect a mouse/keyboard, and switch to PC Mode. And eventually every Linux program would work on Android, and every apk would work on Linux distros.

              And both ecosystems would gain a huge amount of software.

              • lime!@feddit.nu · 7 hours ago

                I’m saying since it runs on the same format anyways, why NOT make it run on both?

                it’s not the same format. android is using an old linux kernel, yes, but the two systems are not compatible at all.

                interestingly, what you’re talking about exists. it’s called samsung dex. they are cancelling it because nobody uses it.

          • Possibly linux@lemmy.zip · 13 hours ago

            You can run Waydroid for android app support. I’m not really sure I understand what you are saying. “Big name” proprietary software will never come to Linux as there is no incentive for companies to spend money on that. You technically can run pretty much any Android app on Linux but that’s a privacy nightmare

  • insomniac_lemon@lemmy.cafe · 16 hours ago

    I understand fragmentation here, as you can get what you need in a format that works well enough.

    Different package formats often have technical differences. Recently I had the choice to use something from a flatpak to reduce lib32 dependencies on my system… but I didn’t go with that as the other dependencies it needed (openGL, graphics driver etc) were redundant thanks to sandboxing (~2GB download!).

    Anything native from itch, GOG, or humble doesn’t really ‘install’ but rather they are just extracted… so the files should be what it is (portable, except game saves/user data likely won’t be). This allows you to run it off of a slower+larger-capacity drive.
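    The “just extracted” model is easy to demo with a stand-in for a downloaded game tarball (all names here are made up; the point is that extraction is the whole installation):

```shell
set -e
dir="$(mktemp -d)"; cd "$dir"

# stand-in for a downloaded release archive
mkdir -p mygame
printf '#!/bin/sh\necho "running from: $(pwd)"\n' > mygame/start.sh
chmod +x mygame/start.sh
tar czf mygame.tar.gz mygame
rm -rf mygame

tar xzf mygame.tar.gz     # the whole "installation"
./mygame/start.sh         # runs in place; deleting the folder "uninstalls"
```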

    EDIT: Also, if you need to compile something, it will probably just compile to wherever you put it (to a bin folder there).

    Non-system stuff like this is more viable for things that you don’t need updated frequently or ever (particularly games/software post-development). For sure, most of the time the best experience is via your package manager.

  • Auster@lemm.ee · 15 hours ago

    I think that, while fragmentation does hinder a system, it is also its saving grace, as it stops a given family of systems from growing into what made the competition problematic.

    Taking the Program Files folders as an example: they have limited read/write permissions on Windows, so whenever possible I try to install programs into a folder I make in the root of C:. But more and more (since at least Windows XP, from what I could observe), Microsoft is training users into using only the users folder, and fewer and fewer programs give an option to install elsewhere, installing only into the Program Files folder instead. Meanwhile, on Linux Mint (my distro of choice), if an AppImage (my go-to medium for programs) isn’t working well, I can always fall back to other means: APT directly, downloading its .deb files and extracting them, getting it from flatpak, compiling it myself, building a custom AppImage, running it in a VM or emulator, or, in the worst case, dual-booting Mint with some other distro.

    Also, although there are many package managers, from my experience they usually work similarly: some changes in syntax, options and names, but nothing outlandish. It would be, I think, like learning a language close to your mother tongue. And from experience, you can even organize installations in a more standardized way, although it will take some effort on your part to figure out how, since some adaptations may be needed (java 8 and sdl ptsd intensify).

    And lastly, from what I can observe, stuff in Linux more often than not shares logic or even methods with a lot of other stuff in the system. Dunno if it’s a bit of a bias from someone who’s been using Linux for a few years already, but the fragmentation usually feels superficial to me, with distros mostly being tweaks of the ones they stem from, and major changes only really visible when distros are sufficiently far apart.