Hello fellow Lemmy users,

I was wondering: what's the best file-sharing protocol/app/website? TBH send.vis.ee currently seems the best to me, but I still wanted your opinions. Here are the things I found:

  1. LocalSend
  2. ffsend
  3. croc
  4. WebTorrent
  5. magic-wormhole
  6. using curl on 0x0.st or pixeldrain
  7. (anonfiles has shut down, so that's sad)
  8. rsync / ssh
  9. OnionShare
  10. IPFS

From what I'm hearing, magic-wormhole makes the most sense, since it seems to be the most open standard for sharing files, but it still seems incomplete, or maybe it's the lack of information on the topic that makes me feel weird about it.

croc seems to have had a lot of CVEs, while magic-wormhole passed that test in SUSE's audit. WebTorrent seems to fit a weird niche, and its implementations like file.pizza aren't really that well built (considering you can't send multiple files there).

I would prefer a CLI, but GUIs as well, so that I can recommend it to somebody else. I'd also like a FOSS protocol, since we can build other apps on top of it. Earlier I used SHAREit, which was so bad that the government literally pulled it over Chinese security concerns.
Currently I'm using LocalSend, but Warp (magic-wormhole) / Warpinator is also looking good.

  • MentalEdge · 11 months ago

    No search engine is going to find a long obfuscated URL. I don’t think NC publishes a site tree for a crawler to use.

    In fact, unless you post your domain somewhere online or its registration is available somewhere, it’s unlikely anyone will ever visit your server without a direct link provided by you or someone else who knows it.

    You might still get discovered by IP crawlers, but even then they aren’t going to be trial and erroring their way to shared files, for the same reason they can’t brute force any sane SSH password.
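To put numbers on the brute-force point: here's a back-of-the-envelope sketch (my own illustration, not from the thread) of the search space for a random alphanumeric URL token:

```python
import math

ALPHABET = 62       # a-z, A-Z, 0-9
TOKEN_LENGTH = 32   # e.g. a 32-character random path segment

# Total number of possible tokens, and the equivalent entropy in bits.
search_space = ALPHABET ** TOKEN_LENGTH
bits = TOKEN_LENGTH * math.log2(ALPHABET)

# Even at a billion guesses per second, the expected time to hit one
# valid token is half the search space.
guesses_per_second = 1e9
expected_years = (search_space / 2 / guesses_per_second) / (3600 * 24 * 365)

print(f"{bits:.1f} bits of entropy, ~{expected_years:.1e} years to guess")
```

At roughly 190 bits, that's comfortably beyond anything an IP crawler will stumble into, for the same reason a sane SSH password survives brute force.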

    • Perhyte@lemmy.world · 11 months ago

      > In fact, unless you post your domain somewhere online or its registration is available somewhere, it’s unlikely anyone will ever visit your server without a direct link provided by you or someone else who knows it.

      If you use HTTPS with a publicly-trusted certificate (such as via Let’s Encrypt), the host names in the certificate will be published in certificate transparency logs. So at least the “main” domain will be known, as well as any subdomains you don’t hide by using wildcards.

      I’m not sure whether anyone uses those as a list of sites to automatically visit, but I certainly would not count on nobody doing so.

      That just gives them the domain name though, so URLs with long randomly-generated paths should still be safe.
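For what it's worth, generating such a path is a one-liner; a minimal sketch using Python's standard library (the domain and path prefix here are placeholders, not anything from the thread):

```python
import secrets

# 32 random bytes -> 43 URL-safe base64 characters (~256 bits of entropy).
token = secrets.token_urlsafe(32)
share_url = f"https://example.com/files/{token}"  # placeholder domain/path

print(share_url)
```

Certificate transparency can reveal the domain itself, but nothing short of the server's own logs reveals the token.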

      • MentalEdge · 11 months ago

        There is also the DNS system itself. I'm not sure whether reverse lookup is possible in some way without a PTR record, but suffice it to say there are ways, and there are many.

        Obscurity is not security, just a reasonable first line of defense. If you run something publicly accessible, lock it down.

        Stuff that can’t be brute forced in a million years is a good way to do that, even if it’s just a string in a URL. It’s basically like having to enter a password. You could even fail2ban it by banning IPs that try a bunch of random URLs that aren’t valid, or use a simple rate-limit.
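A toy sketch of that fail2ban-style idea (entirely my own illustration; a real setup would use fail2ban itself or a reverse-proxy rate limit): count requests for paths that don't exist, and ban the IP after a few misses.

```python
from collections import defaultdict

MAX_MISSES = 5  # hypothetical threshold

class UrlBanList:
    """Ban IPs that probe too many invalid URLs."""

    def __init__(self, valid_paths):
        self.valid_paths = set(valid_paths)
        self.misses = defaultdict(int)
        self.banned = set()

    def allow(self, ip, path):
        """Return True if the request should be served."""
        if ip in self.banned:
            return False
        if path not in self.valid_paths:
            self.misses[ip] += 1
            if self.misses[ip] >= MAX_MISSES:
                self.banned.add(ip)
            return False
        return True

guard = UrlBanList({"/files/long-random-token"})
for _ in range(MAX_MISSES):
    guard.allow("203.0.113.9", "/files/guess")  # documentation IP: five misses

print("203.0.113.9" in guard.banned)  # True: the fifth miss triggers the ban
```

Someone who actually has the valid link never accumulates misses, so legitimate recipients are unaffected.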

    • nixcamic@lemmy.world · 11 months ago

      Nah I have some services running on unpublished domains and I get hit by brute force attempts at SSH logins all the time. It might not be sane but botnet gonna botnet.

      • MentalEdge · 11 months ago

        Oh, same. Though on my current IP it hasn't happened for a couple of years now.

        But finding an SSH port with an IP crawler is a lot easier than finding all the services accessible behind different paths/subdomains on port 80. And even then, mapping out a site tree all the way out to uncrackable-password-length URLs is never gonna happen by brute force.