A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.

  • FaceDeer@kbin.social · 1 year ago

    “Knowing what Jennifer Lopez looks like” is a very distinct thing from “reproducing an exact replica” of training data. OP appears to be arguing that the former is not true because he thinks the latter is true, but it’s actually the opposite. That’s the crux of what I’m arguing here, OP is simply factually wrong about his position.

    Edit: OP has pointed out that he doesn’t actually think there are exact replicas being produced, which just makes this even more confusing.

    • bioemerl@kbin.social · 1 year ago

      OP has pointed out that he doesn’t actually think there are exact replicas being produced, which just makes this even more confusing.

      You misread their first comment, I think.

      They were saying that DESPITE the common argument that AI only learns and doesn’t copy exactly, it might still be good to require consent for people’s content to be in training data.