He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
The arrest is only a positive. Allowing pedophiles to create AI CP is not a victimless crime. As others point out it muddies the water for CP of real children, but it also potentially would allow pedophiles easier ways to network in the open (if the images are legal they can easily be platformed and advertised), and networking between abusers absolutely emboldens them and results in more abuse.
As a society we should never allow the normalization of sexualizing children.
Interesting. What do you think about drawn images? Is there a limit to how skilled the artist can be at drawing/painting? Stick figures vs. lifelike paintings. Interesting line to consider.
If it was photoreal and difficult to distinguish from real photos? Yes, it’s exactly the same.
And even if it’s not photoreal, communities that form around drawn child porn are toxic and dangerous as well. Sexualizing children is something I am 100% against.
It feels like driving these people into the dark corners of the internet is worse than allowing them to collect in clearnet spaces where drawn CSAM is allowed.
I’m in favor of specific legislation criminalizing drawn CSAM. It’s definitely less severe than photographic CSAM, and it’s definitely harmful.
Is this proven or a common sense claim you’re making?
I wouldn’t be surprised if it’s a mixture of the two. It’s kind of like if you surround yourself with criminals regularly, you’re more likely to become one yourself. Not to say it’s a 100% given, just more probable.
The far right in France normalized its discourse, and it is now at the top of the polls.
Also in France, people talked about pedophilia on TV in the 70s, 80s, and the beginning of the 90s. It was not just once in a while; it was frequent and open, without any trouble. Writers would casually speak about sexual relationships with minors.
The normalization will blur the line between AI-generated content and reality, for the worse. It will also make it more popular.
The other point is that people always end up going for the original. Again, politics is a good example: conservatives try to mimic the far right to gain votes, but in the end people vote for the far right…
And say someone has a daughter. A pedophile takes a picture of her without asking and asks an AI to produce CP based on her. I don’t want to see things like this.
Actually, that’s not quite as clear.
The conventional wisdom used to be, (normal) porn makes people more likely to commit sexual abuse (in general). Then scientists decided to look into that. Slowly, over time, they’ve become more and more convinced that (normal) porn availability in fact reduces sexual assault.
I don’t see an obvious reason why it should be different in case of CP, now that it can be generated.
I wonder if religiosity is correlated.
It should be different because people cannot have it. It is disgusting, makes them feel icky, and that’s just why it has to be bad. Conventional wisdom sometimes really is just conventional idiocy.