A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.

The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.

Kaspersky discovered the potential supply chain compromise while investigating suspicious domains, finding that the campaign had been underway for over three years.

  • TrustingZebra@lemmy.one · 10 months ago

    It’s still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.

    Do you know of a good download manager for Linux?

    • FredericChopin_@feddit.uk · 10 months ago

      How much faster are we talking?

      I’ve honestly never looked at my downloads and thought “huh, that should be quicker”. Well, maybe in the 90’s.

      • TrustingZebra@lemmy.one · 10 months ago

        FDM does some clever things to boost download speeds. It splits up a download into different chunks and downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).

        • FredericChopin_@feddit.uk · 10 months ago

          I’m curious as to how it would achieve that?

          It can’t split a file before it has the file. And all downloads are split up. They’re called packets.

          Not saying it doesn’t do it, just wondering how.

          • everett@lemmy.ml · 10 months ago

            It could make multiple requests to the server, asking each request to resume starting at a certain byte.

              • drspod@lemmy.ml (OP) · 10 months ago

                The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of a file.

                This mechanism was introduced in HTTP 1.1 (byte-serving).
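The mechanism described above can be sketched in Python using only the standard library. This is a minimal illustration, not FDM's actual implementation: the helper names and chunk count are made up, and it assumes the server reports a Content-Length and honours Range requests (responding with 206 Partial Content).

```python
import concurrent.futures
import urllib.request

def split_ranges(size, n):
    """Split `size` bytes into n inclusive (start, end) byte ranges."""
    chunk = size // n
    ranges = []
    for i in range(n):
        start = i * chunk
        # The last range absorbs any remainder bytes.
        end = size - 1 if i == n - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

def fetch_range(url, start, end):
    # Ask the server for just this slice of the file via the Range header.
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content means the server honoured the range request.
        if resp.status != 206:
            raise RuntimeError("server does not support byte ranges")
        return resp.read()

def parallel_download(url, connections=4):
    # HEAD request: learn the total size without downloading the body.
    head = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(head) as resp:
        size = int(resp.headers["Content-Length"])
    # Fetch each chunk on its own connection, then reassemble in order.
    with concurrent.futures.ThreadPoolExecutor(connections) as pool:
        parts = pool.map(lambda r: fetch_range(url, *r),
                         split_ranges(size, connections))
    return b"".join(parts)

# Hypothetical usage:
# data = parallel_download("https://example.com/distro.iso", connections=8)
```

Whether this is actually faster depends on the server: it helps most when a single connection is throttled per-stream or suffers from TCP slow-start on a high-latency link.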

      • arglebargle@lemm.ee · 10 months ago

        Just grabbed a gig file - it would take about 8 minutes with a standard download in Firefox. Use a manager or axel and it will be 30 seconds. Then again, speed isn’t everything; it’s also nice to be able to have auto-retry and completion.

    • Xirup@lemmy.dbzer0.com · 10 months ago

      JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed source libraries), kGet, etc.

    • arglebargle@lemm.ee · 10 months ago

      axel. Use axel -n8 to make 8 connections/segments, which it will assemble when it is done.