- cross-posted to:
- pulse_of_truth@infosec.pub
Aliboobie
I am looking forward to this becoming common and effective. Being able to generate animated hentai in assorted styles would be neat. Lina Inverse getting boinked by Donald Duck could be a thing.
Oh my God! That’s disgusting! AI porn online!? Where!? Where do they post those!? There’s so many of them, though. Which one?
Fucking disgusting unethical shit. Tell me where to find it so I can avoid it pls.
OP is funny
fuck you guys, I’m hoarding it all like the filthy perverted bad dragon I am!
Good.
Hot take, but I think AI porn will be revolutionary, and mostly in a good way. The sex industry is an extremely wasteful and inefficient use of our collective time that also often involves a lot of abuse and dark business practices. It’s just somehow taboo to even mention this.
Sometimes you come across a video and you are like ‘oh this. I need more of THIS.’
And then you start tailoring searches to try to find more of the same, but you keep getting generic or repeated results because of the lack of well-described/defined content and the overuse of video tags (obviously to try to get more views by casting a wide net rather than being specific).
But would I watch AI content if I could feed it a description of what I want? Hell yeah! I mean, there are only so many videos of girls giving a blowjob while he eats tacos and watches old black-and-white documentaries about the advancements of mechanical production processes.
The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far off year of 2025 appears.
Me: “You know, in the future, you’ll make your own porn videos.”
90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”
Me: “Ha! No. So Nvidia will release this system called CUDA…”
I thought this was going to go Watchmen for a moment. Like…
It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen Cinemax.
It is 2007, I am in my dorm, I am jerking off to a forum thread full of hi-res porno.
It is 2027, I am jerking off to an AI porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.
Oops the stream hallucinated and mutated into a horror show with women with nipples that are mouths and dicks with eyeballs.
I am about to nut so hard that it shatters my perception of time.
It’s really called Wanx?
It’s times like this that I think, ahhh, the Internet never truly changed :D
First off, I am sex positive, pro porn, and pro sex work; I don’t believe sex work should be shameful, and there is nothing wrong with buying intimacy from a willing seller.
That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest one.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand, it might be trained on other people’s work and take people’s jobs.
I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several such things being legal, but I can’t logically argue for them being illegal without a victim.
I have no problem with AI porn, assuming it’s not based on any real identities; I think that should be considered identity theft or impersonation or something.
Outside of that, it’s more complicated, but I don’t think it’s a net negative. People will still thrive in the porn industry; it’s been around since it’s been possible, and I don’t see why it wouldn’t continue.
Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
Revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and I would argue it’s stealing someone’s identity/impersonation.
To be clear, your example is a sketch of Johnny Depp; I’m talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.
Again, you’re talking about distribution.
Sort of. There are arguments that private ownership of these videos is also weird and shitty; however, I think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise I can see issues cropping up.
Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That’s considered identity theft/fraud when we do it with legally identifying papers, and it’s a similar case here, I think.
I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several such things being legal, but I can’t logically argue for them being illegal without a victim.
I’ve been thinking about this recently too, and I have similar feelings.
I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?
More importantly, what should it be?
It probably goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it wasn’t?
If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour, or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour.
And to know that, we’d need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don’t know whether it would or wouldn’t be), will many politicians actually call for this stuff to be allowed? Seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.
what is the law’s position on AI-generated child porn?
The simplest possible explanation here is that any porn created based on images of children is de facto illegal. If it’s trained on adults explicitly and you prompt it for child porn, that’s a grey area, probably going to follow precedent for drawn art rather than real content.
Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true:
- Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.
- Bob can’t claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.
A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who can, even if they haven’t done any harm yet, be prosecuted, to society’s ultimate profit, so long as you value innocent kids more than perverts.
Good arguments. I think I am convinced that both cases should be illegal.
If the pictures are real, they probably increase demand, which is harmful. If the person knew, then the act should therefore result in jail and forced therapy.
If the pictures are not real, forced therapy is probably the best option.
So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is “victimless”. If they don’t manage to seem plausibly rehabilitated to professionals, then jail time for them.
I would assume (but don’t really know) that most pedophiles don’t truly want to act on it, don’t want to have those urges, and would voluntarily go to therapy.
Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help. There were posters and ads for it in a lot of places a year or so back. I have not researched how it works in practice, though.
Am I reading this right? You’re for prosecuting people who have broken no laws?
I’ll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?
This sounds like some Minority Report hellscape society.
Correct. This quickly approaches thought crime.
What about an AI gen of a violent rape and murder? Shouldn’t that also be illegal?
But we have movies that have portrayed that sort of thing, graphically, for years. Do those then become illegal after the fact?
And we also have movies of children being victimized, so do these likewise become illegal?
We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.
There is no reason to believe the same isn’t true for watching sexual assault. There have been many, many movies that contain such scenes.
But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient, but it’s possible, and efficiency will increase.
The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication, and even then I’m sure workarounds would occur.
Prohibition requires society to sacrifice freedoms, and we have to decide what we’re willing to sacrifice here, because as we’ve seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.
It’s so much simpler than that: it can be created now, so it will be. They will use narrative twists to post it on the clearnet, just like they do with anime (she’s really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules of the phenomenon that is already going to be happening.
The only question is whether or not politicians will stop mudslinging long enough to have an adult conversation, or whether we will just shove everything into the more obscure parts of the internet and let it police itself.
without a victim
It was trained on something.
Yeah bro, wait until you discover where neural networks got that idea from.
It can generate combinations of things that it is not trained on, so there’s not necessarily a victim. But of course there might be something in there; I won’t deny that.
However, the act of generating something does not create a new victim unless someone’s likeness is used and it is shared? Or is there something ethical here that I am missing?
(Yes, all current AI is basically collective piracy of everyone’s IP, but besides that.)
It can generate combinations of things that it is not trained on, so there’s not necessarily a victim. But of course there might be something in there; I won’t deny that.
The lowdown of it is quite simple: if the content is public, available for anybody to consume, and copyright permits it, there’s not really an argument against that (I don’t see why copyright shouldn’t permit it in most cases, although if you make porn for money, you probably hold exclusive rights to it and have a decent position to start from, though a lengthy uphill battle nonetheless). The biggest problem is identity theft and impersonation, more so than stealing work.
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
So take that video and modify it a bit. Color correct or something. That’s still abuse, right?
So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
I can’t make that call. And because I can’t make that call, I can’t support the concept.
I mean, there’s another side to this.
Assume you have exacting control of training data. You give it consensual sexual play, including rough play, BDSM play, and CNC play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
Is the output a grey area, even if it seems like real rape?
On a base semantic and mechanical level, no, not at all. They aren’t real people; there aren’t any victims involved, and there aren’t any perpetrators. You might even be able to argue the opposite: that this is actually a net positive, because it prevents people from consuming real abuse.
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Until you can either publicly display your own, or someone else’s, process of thought, or read people’s minds, this is definitionally an impossible question to answer. So the default is no, because it’s not possible for it to be based in any frame of reality.
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
Assuming it depicts no real persons or identities, no, there is nothing necessarily wrong with this; in fact, I would defer back to the first answer for this one.
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
This is the same as the previous question; the media format makes no difference, it’s telling the same story.
When does the above cross into a problem?
Most people would argue, and I think precedent would probably agree, that this starts to be a problem when explicit external influences are part of the motivation, rather than it being an explicitly internally motivated process. There is necessarily a moral line that must be crossed for it to become more of a negative thing than a positive one. The question is how to define that line in regard to AI.
We already allow simulated rape in tv and movies. AI simply allows a more graphical portrayal.
It’s not just AI that can create content like that though. 3d artists have been making victimless rape slop of your vidya waifu for well over a decade now.
I see the issue with deciding how much of a crime is enough for it to be okay, and the gray area. I can’t make that call either, but I kind of disagree with the black-and-white conclusion. I don’t need something to be perfectly ethical; few things are. I do, however, want to act in an ethical manner and strive to be better.
Where do you draw the line? It sounds like you mean no AI can be used in any case unless all the material has been carefully vetted?
I highly doubt there isn’t illegal content in most AI models of any size made by big tech.
I am not sure where I draw the line, but I do want to use AI services, though not for porn.
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
Is this a legal thing? I’m not familiar with the laws surrounding sexual abuse, on account of the fact that I don’t frequently sexually abuse people, but if this is an established legal precedent, that’s definitely a good argument to use.
However, on a mechanical level, a recounting of an instance isn’t necessarily a 1:1 retelling of that instance. A video of rape, for example, isn’t abuse any more so than the act of rape within it; and of course the non-consensual recording of it (because it’s rape) and the distribution of it could be considered crimes of their own, same with possession. However, I’m not sure that interacting with the video is necessarily abuse in its own right, based on semantics. The video most certainly contains abuse; the watcher of the video may or may not like that, and I’m not sure whether or not that should influence things, because that’s an external value. Something like “X person thought about raping Y person, and got off to it” would also be abuse under the same pretense at a certain point. There is certainly some interesting nuance here.
If I watch someone murder someone else, at what point do I become an accomplice to murder, rather than an additional victim in the chain? That’s the sort of litmus test this is going to require.
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere).
To be clear, this would be a statistically minimal amount of abuse; the vast majority of adult content is going to be legally produced and sanctioned, made public by the creators of those videos for the purpose of generating revenue. I guess the real question here is what percent of X is still considered “original” enough to count as the same thing.
Like, we’re talking probably less than 1% of all public porn, though still a significant margin, is non-consensual (we will use this as the base), and the AI is trained on this set to produce an image minimally alike, or entirely distinct, from the feature set provided. So you could theoretically create a formula to determine how far removed you are from the original content in that 1% of cases. I would imagine this is going to be a lot closer to 0 than to any significant number, unless you start including external factors, like intentionally deepfaking someone into it, for example. That would be my primary concern.
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
Another important concept here is human behavior, as it’s conceptually similar to the AI in question. There are clear, strict laws regarding most of these things in real life, but we aren’t talking about real life. What if I had someone in my family who got raped at some point in their life, and this has happened to several other members of my family or friends of mine, and I decide to write a book loosely based on the experiences of these individuals? (It isn’t necessarily going to be based on those specific instances, but it will most certainly be influenced by them.)
There’s a hugely complex, hugely messy set of questions and answers that need to be given about this. A lot of people are operating on a set of principles much too simple to be able to make any conclusive judgement about this sort of thing, which is why this kind of discussion is ultimately important.
This is the way.
Pics or it didn’t happen.
I feel like this should be a real tattoo….
😂
For any skeezy technology, ask:
- Does it have legitimate uses?
- Would stopping it cause more problems than it solves?
People can fret about pornogrifying real people, but Photoshop already had this debate, and Photoshop kinda has to use real photos of real people. This software defaults to making shit up. Its output, however vulgar, doesn’t need to look like any actual person. Even depicting acts that are extremely illegal is just a machine fantasy matching demonstrable labels. The only possible way to prevent that from including bad things would be to destroy this technology in its entirety, and that simply will not happen.
So stop fretting. You don’t need a perfectly virtuous take on a do-anything program that can emit smut. There will be no version that somehow recognizes your coworkers and refuses to work with that. Nor will Photoshop treat a celebrity’s copy-pasted face like a EURion constellation. A network that can show you a spinning hot dog on the moon can obviously also show tits, because there are more examples of mundane tits than rotating moon dogs.
Empowering people to do science fiction shit with their computers includes bad things. That’s what most science fiction is about! We have to deal with that, the same way we have to deal with people being capable of bad things using e-mail and motor vehicles and pointy sticks. You make a reasonable effort to stop people doing bad things accidentally - and if they do it anyway, you hit them.
This stupid gimmick is how we let everyone become a movie maker.
Whatever it’s good at, with trivial input, we’ll all recognize and become inured to, like how Terragen was jaw-dropping until you saw the dozenth barren landscape. People will use this tech to tell stories, because humans can’t help using things to tell stories. We’ll get shows starring actors who don’t exist, and we’ll judge them viciously among a sea of viewing options, even knowing the whole thing was produced by A Guy. That solitary authorship will allow things that weren’t possible when they needed a budget and a crew. That will include bad things. But I’m less worried about seeing my actual face plastered onto generic gross hallucinations than I am about missing a reality where everyone can make their dream project real.
I am going to make this statement openly on the Internet. Feel free to make AI generated porn of me as long as it involves adults. Nobody is going to believe that a video of me getting railed by a pink wolf furry is real. Everyone knows I’m not that lucky.
Fortunately, most of my family is so tech illiterate that even if a real video got out, I could just tell them it’s a deepfake and they’d probably believe me.
Does the wolf need to be an adult in human years or dog years?
I think it should probably be capable of consent; that would be my guess.
Asking the important questions…
I understand the convention is that the wolf is really a thousand-year-old polymorphed dragon, regardless of physical appearance.
Well that is a plot twist that I was somehow not expecting from an AI porn video.
Drag is finally vindicated?
Yeah, after a certain point this shit doesn’t really do anything… But younger people and mentally vulnerable people are gonna be rocked by this tech lol
I’d like to point out the program is named WanX…wanx…wanks…
Who are the girls in the picture? We can do this, team. Left to right, starting at the top.
- ??
- ??
- bayonet
- little mermaid
- ??
- ??
- Jinx
- ??
- Rei
- Rei
- lol Rei
- Aerith
To which I have to say… good on them for using AI porn in the least bad way? (i.e. realistic fictional characters instead of real people who did not consent to the depictions being made of them).
That’s Rem, not Rei
Who’s Rem?
If you have to ask, you can’t afford it!
That was so high level it went past most of them. Take my upvote
Left of Rei. Likely her twin sister Ram is to the left of her, but IIRC they have different color hairbands, so it’s probably just two copies of Rem; so like [Rem, Rem, Rei].
Character from an anime called Re:Zero
Edit: I guess I got whooshed? I haven’t seen the show yet, just know what some characters look like.
#1 is Markiplier
7 looks like Jinx from Arcane
Added
It feels good to contribute to society
Not to be that guy but… go out and touch some grass.
If you don’t want to be that guy, then stop being that guy.
You shouldn’t judge someone for having a grass fetish. They have wants and needs like anyone else.
When the lawn gets cut, it activates its distress signal with that sweet fresh fragrance… OP can’t help but get off on that.
I’ll never look at an old man with grass-stained, white New Balances ever again.
Because God forbid anyone have fun, right? You need to take your own advice.
You must be fun because you love pornstars! I don’t, so I’m boring! I’m going for some fun grass now. Have a good one.
Ah yes, the famous pornstars “little mermaid” and “aerith”
deleted by creator
Sounds like you need to touch yourself more, actually. It’s okay. Everyone does it.
I don’t usually rely on pornstars, and if I do, I don’t repeat or remember their faces. But I guess there is a big market of pro-fappers on Lemmy.
They’re videogame and movie characters, Einstein.
Based on their response habits, it’s likely a poorly made AI or a 13-year-old kid. Not worth interacting with, because either it is incapable of caring, or they really aren’t supposed to be here, and we really shouldn’t welcome children into adult spaces by allowing them into the conversation. If they don’t wanna have a discussion, then why would they contribute to the conversation when spoken to?
The advice was “touch”, not “smoke”, but at least you were out for some time.
Removed by mod
Oh, I’m sorry, I didn’t know. That perfectly proves my point that some people need to touch some grass, but anyway, I’m not here to judge anyone’s life, it was just a little piece of advice. That’s all for me. Have a good one.
Let me guess, your parents don’t let you watch movies or play video games so you have anger issues towards adults that have done so their entire lives. That’s too bad.
Oh, you wanna talk about me? My parents bought a chipped PS1 when I was a kid and I loved it. Now I have a PS5 (I used to play AC, GTA, and FIFA, but I got bored, so it became a “YouTube frontend”). I had cable TV and a PC with Windows 95/98/XP/Vista (depending on the time) and later Ubuntu. Great childhood, but I also had friends and used to play football every single weekday (the one with the foot and the ball). Now I only play once a week for my local team (but I play really badly, so I play defense). Anything more you wanna know? I’m all for sharing! =) I know, I know, too much about me. Have a great day. And next time just ignore or take the advice. If you feel I’m trolling, just don’t feed me. Bye!!
Nice
Ok I checked it out, why are there so many people with boobs and with 3 foot cocks? Talk about unrealistic body expectations.
I have a 3 foot cock. Well, as long as they’re mouse feet.
I hate how mainstream media is always underrepresenting the size of the typical cock.
Well, genetic engineering is advancing at a good clip.