What are the risks associated with this? With image-uploading capabilities and the like, I’m thinking there might be an issue with people posting highly illegal content. I used to run some smaller forums 15 years ago and that went fine, but it feels like the risks are higher today… I’m thinking both about one’s own mental health in having to moderate such content, and about whether running an instance becomes a legal liability if people post illegal content.
I’d love to support the fediverse but this sounds like a huge hassle and a problem. Maybe it’s just me though, I’m glad that there are others that have decided to host instances.
I never thought I’d be a registered CSAM reporter with the feds, but then I decided to host public content via Lemmy. Turns out, while 99.9% of users are great or fine, that 0.1% are just assholes for the sake of being assholes.
I think Lemmy/Mbin would benefit from ‘moderation pools’. The basic idea: if you subscribe to or join a moderation pool, your instance automatically copies any moderation action taken on content your instance also hosts. This would let multiple single-admin instances stay moderated even during any single admin’s off-hours.
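Nothing like this exists in Lemmy/Mbin today, so purely as a hypothetical sketch of what a pool subscriber could look like (the pool endpoint, payload shape, and `local_*` helpers are all invented for illustration):

```python
import time

import requests  # any HTTP client would do

POOL_FEED = "https://modpool.example/api/actions"  # made-up pool service

def local_copy_of(object_url: str):
    """Find our instance's copy of a federated object, if any.
    Stub: a real version would query the instance's own backend."""
    return None

def remove_locally(object_url: str, reason: str) -> None:
    """Apply the same removal on our own instance. Stub: just logs."""
    print(f"would remove {object_url}: {reason}")

seen: set[str] = set()
while True:
    # Poll the pool for moderation actions taken by any member instance.
    for action in requests.get(POOL_FEED, timeout=10).json():
        if action["id"] in seen:
            continue
        seen.add(action["id"])
        # Only mirror actions on content we also host.
        if local_copy_of(action["object"]) is not None:
            remove_locally(action["object"],
                           reason=f"mod-pool sync: {action['reason']}")
    time.sleep(60)  # runs around the clock, covering any one admin's off-hours
```

The audit-trail reason string matters: if an upstream mod action gets reversed, you’d want to know which removals on your instance were pool-synced rather than your own.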
That’s partially what https://fediseer.com/ does.
The same dev also made a CSAM scanning tool based on AI image recognition.
https://github.com/db0/pictrs-safety
This is why I decided not to host an instance in the end. Where I live, the laws are such that the host is responsible for the content hosted on their servers. So if some shitbag posts CP that gets synced to my server and the authorities somehow find out, it would seriously fuck up my life.
Not only do people avoid creating instances for this reason, but several previously existing instances shut down as a result, like DMV.social.
So CSAM can be used as a tool for suppressing instances.
Obviously, and as it always has been. Bullying behaviors *work*, or people (& animals) would not bother to expend the effort.
That’s a helluva mentally damaged threat actor, but I guess it is effective…
So how do we know it’s not a state doing this, or, say, social media competitors going after each other?
We don’t know who is behind it… how would we even? Or someone gets banned, doesn’t appreciate that, and retaliates. They do what they want, we do what we must.
Hmm, this is something I haven’t heard about. Can you actually register as an instance host with the FBI or equivalent to say “hey, I have a service that may be exposed to CSAM; I do not condone this and will report any cases of it that I see”? If so, that could reduce a lot of people’s specific legal fears about hosting.
Not with the FBI, but with the National Center for Missing & Exploited Children (NCMEC), which collates reports and works with the FBI. Cloudflare and others have services that route all images through their detection systems and will auto-block and report CSAM. I didn’t want to use Cloudflare, but it turns out that if I somehow did accidentally host it, I could be charged with hosting it. I have to report it, or I’m the responsible party.
That’s good to know. I’ve had some half-baked plans to host a public instance for a while (will probably get to it in winter), and honestly the legal risk has been something that’s really held me back. Knowing there’s a way to cover my ass when removing it is great.
Unfortunately this isn’t applicable outside of the US in many cases, mine included.
If you self-host a single-user instance, do you still need to register? I get registering if you host a multi-user instance.
If it’s open to the public, yes. Even if visitors don’t have an account, if they can still see the offending content, then yes.
However, I bet that with nginx you could block public access and effectively require an account: something like “if this isn’t the login page and the request doesn’t carry a token, then block it”.
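As a rough sketch of that idea, assuming lemmy-ui keeps its session token in a cookie named `jwt` (check your version) and that nginx sits in front of both lemmy-ui and the backend (ports below are placeholders). Be aware that fully closing off public access would also break federation unless you exempt the ActivityPub endpoints, which this sketch doesn’t attempt:

```nginx
# goes in the http{} block: flag requests that lack the auth cookie
map $cookie_jwt $anonymous {
    ""      1;   # no "jwt" cookie -> anonymous visitor
    default 0;
}

server {
    listen 443 ssl;
    server_name lemmy.example.org;           # placeholder hostname

    # the login page itself must stay reachable...
    location = /login {
        proxy_pass http://127.0.0.1:1234;    # lemmy-ui (placeholder port)
    }

    # ...as must the login API call, or nobody can ever sign in
    location = /api/v3/user/login {
        proxy_pass http://127.0.0.1:8536;    # lemmy backend (placeholder port)
    }

    location / {
        if ($anonymous) {
            return 302 /login;               # bounce everyone without a token
        }
        proxy_pass http://127.0.0.1:1234;
    }
}
```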
I’m looking into doing this on my single-user instance. I’ve already modified the code so it doesn’t host images that get federated (it simply links to the URL on the original instance), but it would be good to lock things down a bit tighter.
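Conceptually (the actual change is in Lemmy’s Rust image-handling path; this Python stub and its names are just illustration), the modification amounts to:

```python
def url_for_federated_image(remote_url: str) -> str:
    """Which URL to keep for an image arriving via federation.

    Stock behavior (roughly): pull the bytes into local pict-rs and
    serve the copy from our own domain. The modified behavior skips
    ingestion entirely and keeps pointing at the originating instance.
    """
    # Upstream default, disabled in the modified build:
    #   return pictrs_upload(download(remote_url))
    return remote_url  # hotlink to the origin; nothing lands on our disk
```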
Now that they added image proxying I feel a lot better about it, but it’s still risky since it gets piped through my server.
There are some ways to mitigate the majority of that kind of stuff: you can disable image hosting, defederate from instances with poor moderation or poor attitudes, filter out certain keywords, and use cleanup tools like db0’s. Not sure if the caching still occurs if you disable pictrs hosting, though.
Is there a way to choose which instances you want to include, instead of which ones you want to exclude? So that by default no instances federate with you except the ones you explicitly allow? Or is this against the spirit of the fediverse?
By default, a fresh new instance will federate with no other instances, period.
Instances only “learn” about the existence of an outside instance or community after a user enters a community+instance address (e.g. `!community@instance.tld`) in the search bar. After that, the home instance will sync with the remote instance and begin receiving all new push data from that point on.
I don’t know if Lemmy allows “whitelisting” of synced instances, such that it will auto-synchronize with a provided list and ignore all others even if users search for them. I feel like it does, but I’m not familiar enough with the backend to say yea or nay.
Yes that is afaik possible via https://fediseer.com/
Yes, it is possible.
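For reference, on older Lemmy releases the allowlist lived in `lemmy.hjson`; newer releases expose the same allowed/blocked instance lists through the admin UI instead. Roughly (instance names here are just examples):

```hjson
# lemmy.hjson (older releases; newer ones manage this in the admin UI)
federation: {
  enabled: true
  # With a non-empty allowlist, everything not listed is ignored,
  # even if a user searches for it.
  allowed_instances: ["beehaw.org", "lemmy.ml"]
  blocked_instances: []
}
```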