Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, they simply post from other instances.
We keep working on a solution; we have a few things in the works, but they won't help us right now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.
Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It’s been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn’t the first time we felt helpless. Anyway, I hope we can announce something more positive soon.
I don’t want to write a long text, so here is the short version: these automated tools are not perfect, but they don’t have to be. They just have to be good enough to block most of it. The rest can be handled manually, which people have also done voluntarily on reddit. Reporting needs to get easier, and spammers can be held back by rate limiting them.
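To make that concrete, here is a minimal sketch (Python, with made-up names) of the kind of automated filter being described: new uploads are checked against a blocklist of hashes of known-bad files and rejected automatically, and only whatever doesn't match falls through to reports and human mods. Real tools such as PhotoDNA use perceptual hashes so re-encoded copies still match; a plain SHA-256 only catches exact duplicates, so treat this as an illustration of the idea, not a working defense.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hashes of known-bad files.
# Real services use perceptual hashes (e.g. PhotoDNA) so slightly
# altered copies still match; this simplified version only catches
# exact byte-for-byte duplicates.
KNOWN_BAD_HASHES: set[str] = set()

def should_block(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

def handle_upload(file_bytes: bytes) -> str:
    if should_block(file_bytes):
        return "rejected"    # blocked automatically, no human involved
    return "published"       # the rest is left to reports and mods
```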
To be clear, I don’t have anything against temporarily shutting down a community filled with CP until everything is cleared up. But we need better solutions so that in the future it doesn’t have to go this far and stays more manageable.
I’m sorry for the grammatical mistakes. I’m really tired right now and should probably go to bed.
I agree with most of what you’ve written, just one small issue:
You’re probably right that some volunteers handle this content on reddit. By this I mean that mods are volunteers, and sometimes mods handle this content.
My point, however, has been that big social media sites can’t rely on volunteers to handle this content. Reddit, along with facebook and other major sites (but not twitter, as elon just removed this team), has a team of people who pick up the slack where the automated tools leave off. These people are paid, and usually not well, but enough so that it’s their job to remove this content (as opposed to it being a volunteer gig they do on the side). I’ll say that again: these people are paid to look at photographs of CSAM and other psychologically damaging content all day, usually for pennies.
I fully agree with you. It’s just that, as a dev who has toyed around with AI and has been working on code for decades now, I don’t see a clear path forward. I am also not an expert in these tools, so I can’t speak specifically to how well they work. I can only say that they don’t work so well that humans are not required. Ideally, we want tools that work so well humans won’t be required (as it’s a psychologically damaging job), but at the same time, we don’t want legit users to be misflagged either.

The other day there was a link posted to hackerne.ws by a youtube creator who keeps needing to re-enable comments on her shorts. The youtube algorithm keeps disabling comments on her shorts because it thinks there’s a child in the video. It’s only ever been her, and while she is petite in stature, she’s also 30 years old. She’s been reaching out to youtube for 3-4 years now and they still haven’t fixed the issue. For each video she uploads, she needs to turn comments on manually, which hurts her engagement. While nowhere near comparable to the sin of CSAM, it’s also not right for a legit user to be penalized just because of the way she looks, because the algorithm cannot properly determine her age.
Youtube is a good example of how difficult it is to moderate something like this. A while ago, youtube revealed that “a year’s worth of content is uploaded every minute” (or maybe it was every hour? still). Consider how many people would be required to watch every minute of uploaded video. Youtube needs automated tools, plus community reporting, and likely also has a team of mods. And it’s still imperfect.
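A rough back-of-the-envelope makes the point. The numbers below take the “a year’s worth per minute” figure at face value and assume an 8-hour reviewing shift, both of which are just assumptions for illustration:

```python
# How many full-time reviewers would it take to watch everything,
# assuming one year of footage is uploaded every minute?
MINUTES_PER_YEAR = 365 * 24 * 60                 # ~525,600
footage_per_day = MINUTES_PER_YEAR * 24 * 60     # minutes of video uploaded per day
reviewer_minutes_per_day = 8 * 60                # one reviewer, one 8-hour shift

print(f"{footage_per_day / reviewer_minutes_per_day:,.0f} reviewers")
# -> roughly 1.6 million full-time reviewers, before breaks or double-checking
```

Even if the real figure were “per hour” instead, it would still be tens of thousands of people doing nothing but watching uploads.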
So to be clear, you’re not wrong; it’s just a very difficult problem to solve.