Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Mango@lemmy.world · 2 months ago

    No, I don’t. There’s no difference. Are you trying to say that talent gives you a free pass where otherwise there wouldn’t be one? Fuck that. The speed is meaningless. The realism is meaningless. The brush you paint with doesn’t change the ethics even a little bit.

    • otp@sh.itjust.works · 2 months ago

      It’s not about the speed in isolation. The speed is what allows for the quantity to be much greater.

      Just like breaking into one car overnight is bad, but breaking into 100,000 cars over one night is a problem of a much greater scope.

      • Mango@lemmy.world · 2 months ago

        So your point is that because he’s fast with this tool, it’s bad? Guess we gotta institute fake CP data rate limits.

        • otp@sh.itjust.works · 2 months ago

          A tool that allows anyone to generate countless images of CSAM in minutes (based on real images as input) is definitely worse than someone needing to spend years honing a craft and hours producing a single image of CSAM. I’m not really sure how someone could argue against that.