jeffw@lemmy.world (M) to News@lemmy.world · 6 months ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
over_clox@lemmy.world · 6 months ago
And conceptually, if I had never seen my cousin in the nude, I’d never know what young people look like naked.
No, that’s not a concept; that’s a fact. AI has seen inappropriate things, and it doesn’t fully know the difference.
You can’t blame the AI itself, but you can and should blame any and all users who have knowingly fed it bad data.