shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

    • Lionir [he/him] · 49 points · 10 months ago

      > Everybody gets horny, idiot.

      Please don’t call people idiots needlessly.

      > Does it matter if someone jerks off to JaLo in the Fappening or some random AI generated BS?

      The issue is that this technology can be used to create pornographic material of anyone that has some level of realism without their consent. For creators and the average person, this is incredibly harmful. I don’t want porn of myself to be made and neither do a lot of creators online.

      Not only are these images an affront to people’s dignity, but it can also be incredibly harmful for someone to see porn of themselves that they never made, with their likeness put on someone else’s body.

      This is a matter of human decency and consent. It is not negotiable.

      As mentioned by @ram@lemmy.ca, this can also be used for other harmful things like CSAM, which is genuinely terrifying.

      • @TheFriendlyArtificer@beehaw.org · 27 points · 10 months ago

        I have to disagree (but won’t downvote!)

        AI porn is creepy. In multiple ways!

        But it’s also a natural evolution of what we’ve been doing as a species since before we were a species.

        Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

        How about if somebody draws a crude stick figure of somebody they met on the street? Unless you’re Randall Munroe, this is probably harmless too.

        Now a highly skilled portrait artist paints a near replica of somebody they know but have never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

        Maybe a digital artist finds a few social media pictures of a person, decides to test drive Krita, and manipulates them into appearing nude.

        Or, and this happened to me quite recently, you find your porn doppelganger. My spouse found mine and it ruined her alone time. And they really did look just like me! Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

        Like you, I don’t want people like this in my life. But it feels like this is one of those slippery slopes that turns out to be an actual slippery slope.

        You can’t make it illegal without some serious downstream effects.

        If you did, the servers would just get hosted in an Eastern European country that is happy to lulwat at American warrants.

        I don’t have any answers, just more Devil’s-advocate-esque questions. If there were a way to make it illegal without any collateral damage, I’d be proudly behind you leading the charge. I just can’t imagine a situation where it wouldn’t get abused, à la the DMCA.

        • Lionir [he/him] · 2 points · 10 months ago

          > Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

          You can’t share that, though, so while I still think it is immoral, it is also practically impossible to know about.

          > Now a highly skilled portrait artist paints a near replica of somebody they know but have never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

          > Maybe a digital artist finds a few social media pictures of a person, decides to test drive Krita, and manipulates them into appearing nude.

          Those would be immoral and reprehensible. The law already protects against such cases on the basis of using someone’s likeness.

          It’s harmful because it shares images of someone doing things they would never do. It’s not caricature, it’s simply a fabrication. It doesn’t provide criticism - it is simply erotic.

          > Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

          If the goal is to look like you, I would imagine the law could offer some recourse. Otherwise, it is simply coincidence; there’s no intent there.

          I don’t think it is a stretch or a slippery slope. Just as a picture is captured by a camera, a drawing is made by a person or a machine.

          Both should be treated the same, and that is often already the case in many jurisdictions around the world when it comes to CSAM.

          • Rekorse · 1 point · 10 months ago

            All of your arguments assume profit is the motive. Are you saying that as long as no profit is made, it would be okay to do all of these things? (e.g. self-use only)

            • Lionir [he/him] · 1 point · 10 months ago

              No. I think that it would still be bad if it were self-use because it is ultimately doing something that someone doesn’t consent to.

              If you were to use this on yourself or someone consenting, I see no issues there - be kinky all you want.

              Consent is the core foundation for me.

              The reason why imagining someone is different is that it is often less intentional - thoughts are not actions.

              Drawing someone to look like somebody you know is very intentional. Even worse, if you are drawing someone you know naked, chances are high that you never asked for their consent because you know you wouldn’t get it.

        • Lionir [he/him] · 1 point · 10 months ago

          > How is AI pedophile stuff worse than actual pedophile stuff?

          It’s not worse - it’s just as bad.

            • Rekorse · 2 points · 10 months ago

              That person just can’t grapple with any nuance, as they are afraid to let the sentence “AI child porn is less bad” come out of their mouths.

            • Lionir [he/him] · 1 point · 10 months ago

              I don’t like grading evil for this very reason, so I will refrain from doing so - thank you for catching me doing that.

              That said, AI CSAM could enable other forms of abuse through blackmail. I can also see very harmful things happening to a child or teenager because people may share this material in a targeted way.

              I think both are inhumane and disgusting.

                • Lionir [he/him] · 1 point · 10 months ago

                  > I mean, maybe calling it evil is part of the problem?

                  I call it evil because it is intentional and premeditated.

                  > There are degrees in everything. Punching somebody is less bad than killing somebody.

                  Trying to rank everything by degree is bound to show ignorance and to imply that certain things are more acceptable than others.

                  I don’t want to hurt people with my ignorance and I do not want to tell someone that what they experienced is less bad than something else. They are bad and we’ll leave it at that.

                  > Btw it’s totally humane because we invented the shit.

                  I am working with this definition: “Characterized by kindness, mercy, or compassion”. There is a difference between human-made and humane.

    • Jordan Lund · 26 points · 10 months ago

      You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn’t consent to, that’s a huge problem.

      Both for the people whose images were used to train the model and for the people whose images are generated using the models.

      Non-consent is non-consent.

      This is how you get the feds involved.

      • ram · 36 points · 10 months ago

        Let’s not forget that these AIs aren’t limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

        • @PelicanPersuader@beehaw.org · 14 points · 10 months ago

          Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn’t happen, meaning those resources won’t be used to save real children in actual danger.

        • MaggiWuerze · 1 point · 10 months ago

          On the other hand, this could be used to create material without causing new suffering, which might reduce the need for actual children to be abused in its production.

          • ram · 4 points · 10 months ago

            Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

            • @ichbinjasokreativ@beehaw.org · 1 point · 10 months ago

              It’s (rightfully) currently illegal, but that doesn’t stop people. Keep it illegal, increase punishment drastically, make AI-created material a grey area.

              • Rekorse · 2 points · 10 months ago

                It’s already the worst crime around and people still do it. Maybe it’s not the punishment we need to focus on.

              • ram · 1 point · edited · 10 months ago

                I’m not sure increasing punishment is actually an effective manner of combating this. The social implications of being a child predator are likely to be a stronger deterrent than the penal system, imo (I don’t have data to back that up).

                I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I’d much rather help people be productive, non-violent members of society than lock them up, if given a choice.

            • MaggiWuerze · 1 point · 10 months ago

              Sure they do, but if they are going to consume it anyway, would you rather a real child had to suffer for it, or just an AI-generated one?

              • ram · 1 point · 10 months ago

                Neither. I would have mental health supports that are accessible to them.

                • @tweeks@feddit.nl · 2 points · 10 months ago

                  Of course we want neither, but it comes across as if you’re dismissing a possible direction toward a solution to the definitively worse outcome (real-life suffering) with a purely emotional knee-jerk reaction.

                  Mental health support is available and real CSAM is still being generated. I’d suggest we look into both options: advancing the ways therapists can help, and at least having an open discussion about these sensitive solutions that might feel counter-intuitive at first.

          • @tweeks@feddit.nl · 1 point · 10 months ago

            That’s a fair point. And I believe AI would be able to combine legal material to create illegal material. Although this still feels wrong, if it excludes suffering from the base material and reduces future (child) suffering, I’d say we should at least do research on it. Even if it’s controversial, we need to look at the rationale behind it.

      • @Evergreen5970@beehaw.org · 10 points · 10 months ago

        As someone who personally wouldn’t care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn’t like me may have the option to generate AI porn of me having sex with a child. Now there’s fake “proof” I’m a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I’m vindicated in court, I might still be convicted in the court of public opinion.

        And people could post faked porn of me and send it to companies to say “Evergreen5970 is promiscuous, don’t hire them.” Not all of us have the luxury of picking and choosing between companies based on whether they match our values; some of us have to take what we can get, and sometimes that includes companies that would judge you for taking nude photos of yourself. It would feel especially bad given that I’m a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn’t do.

        Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.

        And once faking abuse with AI porn becomes well-known, it might discredit actual photographic and video proof of CSAM happening. Humans already struggle to tell whether an image was captured by a camera or generated by AI, and AI doesn’t detect AI-generated images with perfect accuracy either. So the question becomes “how can we trust any image anymore?” Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might tweak their algorithms to prevent people from generating any porn involving minors, but there’ll probably always be some floating around with those guardrails turned off.

        I’m also very wary of dismissing other people’s discomfort just because I don’t share it. I’m still worried for people who would care about someone making AI porn of them, even if it was just to masturbate with and kept private.