How will Lemmy handle illegal content like drug dealing, child porn, snuff movies, etc.? On Reddit, the corporation is accountable, so it will make an effort to ensure none of this exists on its platform, and it would face legal trouble if it failed to act.

But as Lemmy is decentralised, this isn’t really possible. Sure, the main instances can defederate from bad instances, but those instances could still operate and be accessible on the web, especially if they’re hosted in countries outside of the western sphere of influence. Multiple bad instances could federate and duplicate the illegal content pretty easily, making it difficult for the authorities to keep it shut down. Has this been thought about already?

  • Rottcodd@kbin.social

    “Lemmy” can’t handle anything. That’s by design.

    “Lemmy” is really just a piece of software that people can use to run forums that will federate with other forums, and so forth and so on. There is no central “Lemmy” authority that could do anything; that’s by design, and a big part of the point. It means that there can never be a Lemmy spez or Musk or Zuckerberg fucking things up for everyone.

    The highest authorities are the individual instance owners, so it will fall on them to deal with illegal content as they see fit. Presumably they’ll generally work to keep it off of their own instances through active moderation, and they’ll block other instances that they have reason to believe do not maintain acceptable standards.

    And like it or not, some share of responsibility will fall on individual users to manage their own activities in order to avoid problematic instances.

    The trade-off for having no central authority that can fuck things up for everyone is that there’s no big mommy/daddy to watch over you and protect you. The fediverse is better suited for people who are okay with that.

  • daniskarma@lemmy.world

    It’s like when those sites are hosted on an Apache server.

    Lemmy is just the underlying software that runs an instance.

    Authorities would track the illegal content the same way they do on any other website; I wouldn’t worry too much about it. Also, decentralised illegal content has existed for as long as P2P protocols have. I don’t see anything new with Lemmy.

    • I Cast Fist@programming.dev

      It just makes it easier to find. Instance owners/admins are legally bound by the host country’s laws, so the smarter ones know in which countries they should host their stuff.

  • kuneho@lemmy.world

    My first guess would be that filtering out illegal content is something the operators/admins of a given instance have to take care of, according to the law of the country the server is hosted in.

    • r00ty@kbin.life

      This becomes a problem, though, when, as in my case at the moment, it’s just me running my own instance. Even if other users join my instance, how many are going to want to moderate the content? Not many, I’d bet. There is so much content coming in right now: as of this writing I’ve received 405 new threads, 5,652 new comments, and information like avatars (which could also be explicit) from 6,058 users.

      How does one person moderate all of that? I’m not sure (I haven’t tested it, but I doubt it) whether a moderator deleting a comment or post on the owning instance also sends that deletion to the federating instances. I think it should (and maybe make it configurable whether you honour those deletions, but it should be on by default); see the sketch below. That way the moderation team on the hosting instance removes the content everywhere. Reported content can also be deleted per instance, but that should be the outlier.
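
For context: ActivityPub does define a Delete activity that the origin instance can push to its federation peers, and whether receiving instances honour it is exactly the open question above. A minimal sketch of what such a payload and broadcast could look like; the domains and IDs are made up, and real implementations also sign these requests:

```typescript
// Hypothetical federated deletion, modelled on the ActivityPub Delete
// activity. Domains and IDs below are illustrative only.
const deleteActivity = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Delete",
  id: "https://example-instance.social/activities/delete/abc123",
  actor: "https://example-instance.social/u/moderator",     // mod performing the removal
  object: "https://example-instance.social/comment/456789", // the comment being removed
  to: ["https://www.w3.org/ns/activitystreams#Public"],
};

// The origin instance POSTs the activity to each peer's inbox; every
// receiving instance then decides for itself whether to apply it.
async function broadcastDelete(peerInboxes: string[]): Promise<void> {
  for (const inbox of peerInboxes) {
    await fetch(inbox, {
      method: "POST",
      headers: { "Content-Type": "application/activity+json" },
      body: JSON.stringify(deleteActivity),
    });
  }
}
```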

      • Airazz@lemmy.world

        Reddit mods worked for free, there’s no reason why it would be different here. The tricky part will be picking the competent and fair ones out of all the applicants.

  • RightHandOfIkaros@lemmy.world

    The responsibility would fall on the instance admin(s). Moderators could also share the responsibility, as could any users involved in the illegal activity.

  • Th4tGuyII@kbin.social

    Each Fediverse instance is just a server running software (e.g. Lemmy, Kbin) that uses the ActivityPub protocol to communicate with other instances.

    There is no central authority; that’s kinda the whole point of being federated and decentralised. Each instance is its own website, its own island. It’d be like asking how “email” or “HTTPS” would handle illegal content. They don’t; it’s up to the hosts themselves to do so.

    On a fundamental level, each instance is just a website, so an illegal instance would be tracked down and prosecuted in the same way as any other website/forum doing illegal stuff.

    The best you can do is encourage your instance’s admin to defederate from those illegal instances if they haven’t already, and let the authorities handle the rest.
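
As a concrete illustration of defederation being per-instance: Lemmy exposes each server’s linked and blocked instances over its HTTP API, so anyone can inspect whom an admin has defederated from. A rough sketch, assuming the `/api/v3/federated_instances` endpoint and the response shape of recent Lemmy versions (older servers may differ; the domain is a placeholder):

```typescript
// Query which instances a Lemmy server blocks. The response shape
// below follows recent v3 API versions and may vary between releases.
interface FederatedInstances {
  federated_instances: {
    linked: { domain: string }[];
    blocked: { domain: string }[];
  };
}

async function getBlockedInstances(host: string): Promise<string[]> {
  const res = await fetch(`https://${host}/api/v3/federated_instances`);
  const data = (await res.json()) as FederatedInstances;
  return data.federated_instances.blocked.map((i) => i.domain);
}

// "lemmy.example" is a placeholder; swap in a real instance domain.
getBlockedInstances("lemmy.example").then((blocked) =>
  console.log("Defederated from:", blocked),
);
```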

  • ashethursday@lemmy.world

    I’d say moderation is up to each instance, but it does leave the question of whether anyone is legally responsible for ensuring no instance has illegal content; I’d guess not. Since the platform is open source, the perpetrators would be solely responsible for illegal content on their instance, as they used the open-source platform for nefarious purposes… I’m thinking individual instances have a legal obligation to keep that content blocked, but I’m not sure. I think this is a really good question.

  • bdonvr@thelemmy.club

    Images aren’t federated. You’re viewing them from their host server.

    So they’re not really duplicated anywhere.

    • myersguy@lemmy.simpl.website

      I’m not so sure this is the case. The images folder on my single-user instance is over 1 GB, and I haven’t posted any images.

        • myersguy@lemmy.simpl.website

          I’m going to have to disagree. I just remoted in to check it out: I have 2560×1440 images weighing 2.9 MiB in there. It’s pretty clear that these are full-size images.

          • bdonvr@thelemmy.club

            Well, that’s interesting, because you’ll never see those images actually served to users: try to find a remote post where the image URL points at your local server. It doesn’t happen.

            • myersguy@lemmy.simpl.website

              Interesting indeed. I spent some time combing through the DB and pictrs folder to try to figure it out, but I’m at a loss so far.
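
One way to probe this from the outside, assuming the v3 post-listing API and its `url`/`thumbnail_url` fields (the domain is a placeholder): if an instance caches remote images, the thumbnail URL points at its own domain even when the full image URL points at the original host.

```typescript
// Compare full image URLs with locally generated thumbnail URLs on a
// Lemmy instance. Assumes the v3 post listing API shape;
// "lemmy.example" is a placeholder domain.
async function inspectImageHosting(host: string): Promise<void> {
  const res = await fetch(`https://${host}/api/v3/post/list?limit=20`);
  const { posts } = await res.json();
  for (const p of posts) {
    const url: string | undefined = p.post.url;
    const thumb: string | undefined = p.post.thumbnail_url;
    if (url && thumb) {
      // A cached copy shows up as a thumbnail_url on the instance's own
      // domain while the full image URL still points at the remote host.
      console.log(`full: ${url}\ncached: ${thumb}\n`);
    }
  }
}

inspectImageHosting("lemmy.example");
```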

  • frankPodmore@slrpnk.net

    With Mastodon, there are blocklists that make it fairly easy for moderators to block instances that post illegal content (or anything else, for that matter). I imagine something similar could be done with Lemmy; see the sketch after this thread.

    • balls_expert@lemmy.blahaj.zone

      Who will the blocklist affect? Mods don’t control what’s posted on another instance, and if there are instances where that’s commonplace, people will just post there.


      • frankPodmore@slrpnk.net

        I’m not sure I understand your question! Lemmy is FOSS, so anyone can set up an instance, invite users, and put stuff on there. But if everyone else defederates from that instance, its content won’t show up anywhere except on that instance itself. So the illegal content on that hypothetical Lemmy instance isn’t the responsibility of anyone except its owner. It’s not the fault of the Lemmy developers, any more than it’s Microsoft’s fault if I put something illegal on my Windows PC.
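
For reference on the blocklist idea above: Mastodon’s shareable blocklists are plain CSV files of domains, so turning one into a defederation list is straightforward. A sketch assuming the commonly shared `#domain,#severity,...` column layout (the domains below are made up):

```typescript
// Parse a Mastodon-style domain blocklist CSV into a plain list of
// domains to defederate from. Assumes the first column is the domain
// and the first row is a header; adjust if your list differs.
function parseBlocklist(csv: string): string[] {
  return csv
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((line) => line.split(",")[0].trim())
    .filter((domain) => domain.length > 0);
}

const exampleCsv = `#domain,#severity
badinstance.example,suspend
worse.example,suspend`;

console.log(parseBlocklist(exampleCsv)); // ["badinstance.example", "worse.example"]
```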

  • cloaker@kbin.social

    I’m sure it would be easy for authorities to track; duplicating content is no harder than downloading a large zip and uploading it elsewhere, but authorities can recognise the unique characteristics of files and find them across the web.

    As for actually stopping this: right now a lot of instances federate with many others, and I’d be worried about bad actors federating with big instances and then posting illegal material, which the big instances would download onto their servers by accident. More moderation tools need to be developed for the fediverse; see the sketch below.
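
On the “unique characteristics of files” point: the simplest version is exact hash matching against a list of known-bad digests. Production systems (such as PhotoDNA or PDQ) use perceptual hashes that survive re-encoding; this sketch shows only the exact-match idea, and the hash list is a placeholder:

```typescript
// Exact-match fingerprinting sketch: flag files whose SHA-256 digest
// appears on a list of known-bad hashes. Real deployments use
// perceptual hashing so that re-encoded copies still match.
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

// Placeholder list; a real one would come from a clearinghouse.
const knownBadHashes = new Set<string>([
  "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
]);

async function isKnownBad(path: string): Promise<boolean> {
  const bytes = await readFile(path);
  const digest = createHash("sha256").update(bytes).digest("hex");
  return knownBadHashes.has(digest);
}

isKnownBad("./upload.jpg").then((bad) =>
  console.log(bad ? "matches known-bad list" : "no match"),
);
```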

  • Drewfro66@lemmygrad.ml

    There are far easier, more secure ways for people sharing, say, child porn to do so than hosting a Lemmy instance. The only real benefit of a Lemmy instance would be the ease of use.

    A Lemmy instance is just a website, possibly running on someone’s basement server rack. There’s no way for any authority to stop them from hosting child porn on that server, unless that authority is (1) their ISP or (2) the police.

    Lemmy (or, rather, ActivityPub) is just an internet protocol, like email. You can’t stop someone from hosting child porn on a Lemmy instance, just like you can’t stop someone from sending child porn over email. This is not a reason that Lemmy should not exist, in the same way it’s not a reason for secure, encrypted email not to exist. Enforcement falls to traditional, (supposedly) accountable authorities, which is much better than it falling to the administrators of a private company.

  • f1g4@feddit.it

    Illegal content… well, it depends. Illegal information and piracy tricks? Already happening. But beyond that, illegal media needs to be stored somewhere and be widely accessible. It’s still better to do illegal shit on WhatsApp or Telegram (as still happens) because messages there, unlike here, are end-to-end private. I don’t think this is so novel (it isn’t, in fact) that authorities or admins wouldn’t know what to do. Also, “illegal” is too generic; laws vary from country to country. I don’t see Lemmy becoming any sort of terrorist hub anytime soon.

      • f1g4@feddit.it

        Stolen data. Credit cards, phone numbers, etc. Illegal procedures (e.g. h0w to buiId a b#mb). Stolen accounts and passwords.

      • b3nsn0w@pricefield.org

        ask the warthunder forums, lol

        spoiler: they repeatedly leaked classified military documentation to win arguments

  • Michael@lemmy.perthchat.org

    Cloudflare offers abuse-imagery scanning, which seems ideal for Lemmy. If you’re already using them as a CDN, it’s something you can simply apply for.

    Instances that don’t want to take any action against such content would likely be prosecuted eventually.

  • cfx_4188@discuss.tchncs.de

    The creators of Lemmy keep telling us about the “federation” of the social network. A federation is something that has a federal center exercising a governing function.

    Just because we all haven’t been informed of the existence of such a center doesn’t mean that it doesn’t exist.

    • ravheim@kbin.social

      There was a discussion a few weeks ago where the NSFW instance made it a rule that illegal material would not be removed.
      There was then a clarification that obviously illegal material would be removed, but questionable material would be allowed to stay.
      This leaves everyone who allows NSFW material to show up on their feed open to seeing illegal content.
      OP’s question is a valid concern.