• webghost0101 · 8 points · 10 months ago

    They did literally nothing and seem to use the default Stable Diffusion model, which is only meant to be a tech demo. It would have been easy to put “(((nude, nudity, naked, sexual, violence, gore)))” in the negative prompt.
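(For context on the triple-parentheses syntax: in front ends like AUTOMATIC1111's web UI, each layer of parentheses multiplies a term's attention weight, conventionally by 1.1 per layer, so `(((nude)))` weighs the term at roughly 1.1³ ≈ 1.33. A minimal sketch of that parsing, assuming the 1.1-per-layer convention and only fully wrapped terms:)

```python
def emphasis_weight(term: str, base: float = 1.1) -> tuple[str, float]:
    """Strip matched outer '(' ')' layers from a prompt term and return
    (bare_term, weight). Simplified sketch: assumes the whole term is
    wrapped, e.g. '(((nude)))', not mixed groupings like '(a)(b)'."""
    depth = 0
    while term.startswith("(") and term.endswith(")"):
        term = term[1:-1]   # peel one emphasis layer
        depth += 1
    return term, base ** depth

# '(((nude)))' peels three layers: weight is 1.1 ** 3
word, weight = emphasis_weight("(((nude)))")
```

An unwrapped term like `gore` keeps the default weight of 1.0.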

    • megopie@beehaw.org · 7 points · 10 months ago

      The problem is that negative prompts can help, but when the training data is so heavily poisoned in one direction, stuff still gets through.