• 1 Post
  • 82 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • AI isn’t stealing jobs any more than cheap clip art has. The only people who would resort to AI for illustration stopped hiring actual artists ages ago; they buy from Shutterstock and the like instead.

    The reason artists are pissed is because they used our art to train the AI without our permission. And no, it’s not the same thing as an artist learning from others: first because of the scale, and second because a student who is learning from other artists isn’t looking to copy an existing style; they’re learning and developing their own. AI just regurgitates what it already knows and attempts to imitate the style of an actual person. It was developed specifically to do that.

    If a human being copies the style of another artist rather than develop their own, they’ll be called out too. No one has ever been okay with that, ever.



  • I run a mastodon instance with some friends, and a single-user misskey instance for myself because I like it better than mastodon. It’s its own thing; it’s been around since 2014 and it’s very robust and stable. I love being able to format posts, easily customize the instance and write more than 500 characters.

    I’m personally not a huge fan of firefish but it also works pretty well and has many of the same features as misskey.





  • What you’re missing is the fact that it would be a huge risk and we have mouths to feed and bills to pay. Who’s to say the other people in the mutual aid group will stick around? Without a job I have no health insurance, what happens if I get cancer? Who pays for that? How do I know that I’ll be able to pay rent every month? With whose money? What happens if my dog needs surgery, who pays for that? Where do I get money for groceries?

    Also as one of the working poor I have never relied on charity for anything because I live in a shit hole that doesn’t really even offer anything. Only churches do charity work here and my gay ass is not welcome there.

    That’s not even taking into account the fact that you’d have to get people to agree to help each other. Do you think all the racist white assholes in my town will join up with black and Mexican families? lmao


  • Bob@lemmy.world to Fediverse@lemmy.world · *Permanently Deleted* · edited 1 year ago

    What do you mean I don’t host it like mastodon would? Are you talking about the “official” mastodon instance, mastodon.social? Because they do exactly what I described above, and you’d have known if you’d just checked https://mastodon.social/about before posting. Just click “moderated servers” (or pull the same list from the API, as sketched below). See all the “inappropriate content” that’s listed there? Most of those allow lolicon/shotacon (as evidenced by their defederation of pawoo). It is standard practice, and whether or not you agree is completely irrelevant. That’s how most instances are run, period, except for the free speech absolutists, who are also defederated by everyone else for allowing bigotry/propaganda/lolicon and so on.

    I know that there’s a tradition of confidently posting on reddit about something you don’t know anything about and acting like your opinion on the subject you know nothing about is Very Important, but this ain’t reddit.
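
    For reference, that “moderated servers” list isn’t just a web page: Mastodon 4.0+ exposes the same data through the public API whenever an instance chooses to publish its blocks, which mastodon.social does. Here’s a minimal sketch in Python (assuming the requests library); on instances that keep their list private, the endpoint just returns an error instead.

    ```python
    import requests

    # Public endpoint (Mastodon 4.0+): lists the domains this instance limits or suspends.
    resp = requests.get("https://mastodon.social/api/v1/instance/domain_blocks", timeout=10)
    resp.raise_for_status()

    for block in resp.json():
        # Each entry carries the blocked domain, a severity ("silence" or "suspend"),
        # and an optional public comment explaining why it was blocked.
        print(f"{block['domain']}: {block['severity']} ({block.get('comment') or 'no comment'})")
    ```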


  • I can’t tell if you’re trying to be funny or not but I’ll answer anyway.

    There’s a difference between federating with instances that disallow any pornography featuring models/characters/etc who look underage, and federating with instances that allow that type of material. Actual CSAM will be immediately obvious and handled very quickly on the former, but not necessarily on the latter.

    It’s pretty much standard practice for Mastodon/*key/whatever admins to defederate instances that allow lolicon/shotacon and anything else like that. There are curated block lists out there and everything, we’ve been doing it for years while still federating and we’re doing just fine.
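
    And for anyone curious what defederating actually looks like mechanically, on Mastodon it boils down to a single admin API call (or the equivalent form in the admin UI). A rough sketch, assuming a token with the admin:write:domain_blocks scope; the instance URL, token and blocked domain below are placeholders, not real values.

    ```python
    import requests

    INSTANCE = "https://example.social"  # placeholder: your own instance
    TOKEN = "YOUR_ADMIN_TOKEN"           # placeholder: token with admin:write:domain_blocks scope

    # Create a domain block (Mastodon 4.0+ admin API). severity="suspend" fully
    # defederates the remote instance; "silence" would only limit its visibility.
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "domain": "bad-instance.example",  # placeholder: the instance being blocked
            "severity": "suspend",
            "private_comment": "allows loli/shota, per curated block list",
        },
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json())  # the newly created DomainBlock record
    ```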


  • The problem is that if it’s hard to tell at a glance, there’s no way to know if actual CSAM gets uploaded there in the future. So what it boils down to is, is it worth the risk? That admin says no, it isn’t, so they defederate.

    My Mastodon instance defederates pretty much any instance that allows sexually explicit or suggestive artwork or photos of people who look underage. It’s just not worth it.




  • There aren’t nearly as many right wingers and fascists as social media makes you believe. Speaking as a Mastodon instance admin, every single time we’ve had huge waves of bot and/or troll signups they’ve been very clearly right wing accounts (almost all had similar bios) that almost immediately started interacting with and boosting each other as well as harassing trans and queer people.

    The thing about the fediverse is that it can’t be manipulated the way centralized social media can. So what happens is that it gets handled very quickly. They get banned and their instances get defederated so all they can do is shout into the void. They’re not, nor have they ever been, the majority by any stretch of the imagination, and most people have absolutely no desire to hear what they have to say at all.

    On top of that, a huge number of them are grifters, and they won’t get any engagement here. They can’t get the kind of viral outrage they need because most people aren’t even seeing their posts.


  • Bob@lemmy.world to Fediverse@lemmy.world · *Permanently Deleted* · edited 1 year ago

    First of all I want to make it clear that I don’t agree with this defederation; if the models are verified adults, then there is no problem.

    That said, as a Mastodon instance admin, I wanna explain something to y’all. CSAM is one of those things that you do not want to take your chances with as an admin. Beyond the obvious fact that it’s vile, even having that shit cached on your server can potentially lead to very serious legal trouble. I can see how an admin might choose to defederate, because even if right now all models are verified, what if something slips through the cracks? (Pun not intended, but I’ll roll with it.)

    My instance defederates a bunch of Japanese artist instances like pawoo because of this. All it takes is one user crossing the line, one AI generated image that looks too real.

    Aside from all that, there’s also a lot of pressure being put on many instance admins to outright ban users and defederate instances that post or allow loli/shota artwork as well. You’re quickly labeled a pedophile if you don’t do it. A lot of people consider fake CSAM to be just as bad, so it’s possible that the other admin felt that way.

    I’m more lenient on loli/shota as long as it’s not realistic because I understand that it’s a cultural difference and generally speaking Japanese people don’t see it the way we do. I don’t ban stuff just because I think it’s gross, I just don’t look at it.

    Anyway, what I’m trying to say, I guess, is that being an admin is hard and there’s a lot of stuff y’all don’t know about. So disagree with that person if you want (I do too), but keep in mind that these decisions don’t come easy and nobody likes to defederate.

    EDIT: here’s a mastodon thread about the CSAM problem in the fediverse if you’d like to learn more.




  • Bob@lemmy.world to Politics@lemmy.ml · This is facts. · 1 year ago

    This isn’t new. They’ve always wanted this. The only new thing is that now they feel emboldened to actually do it.

    And every single person who feels safe right now because this isn’t affecting them is in for a very rude awakening.

    I live in Texas. I know these people. They think American = white, cis, straight, conservative. They will stop at nothing. They don’t care who gets hurt.