• over_clox@lemmy.world · 6 months ago

    The only example I can think of with what you said is just a couple of brief, innocent scenes from The Blue Lagoon.

    Short of that, I don’t know of (nor care for any references to) any other legal public images or video of anything like that.

    I dunno, I’m just bumfuzzled as to how AI, whether public or private, could have sufficient information to generate such things these days.

    • FaceDeer@fedia.io · 6 months ago

      Do a Google Image search for “child” or “teenager” or other such innocent terms and you’ll find plenty.

      I think you’re underestimating just how well AI is able to learn basic concepts from images. A lot of people imagine these AIs as being some sort of collage machine that pastes together little chunks of existing images, but that’s not what’s going on under the hood of modern generative art AIs. They learn the underlying concepts and characteristics of what things are, and are able to remix them conceptually.

      • over_clox@lemmy.world · 6 months ago

        And conceptually, if I had never seen my cousin in the nude, I’d never know what young people look like naked.

        No, that’s not a concept, that’s a fact. AI has seen inappropriate things, and it doesn’t fully know the difference.

        You can’t blame the AI itself, but you can and should blame any and all users that have knowingly fed it bad data.