misk to Technology@lemmy.world · English · 1 year ago
AI image training dataset found to include child sexual abuse imagery (www.theverge.com)
sir_reginald@lemmy.world · 1 year ago (edited)
Removing these images from the open web has been a headache for webmasters and admins for years on sites that host user-uploaded images. If the millions of images in the training data were automatically scraped from the internet, I don't find it surprising that there was CSAM in there.
Communist@lemmy.ml · 1 year ago
Don't they need to label the data?
Not manually
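For context on "not manually": web-scale image datasets such as LAION are typically labeled by pairing each image with the alt text already present in the crawled HTML, not by human annotators. A minimal sketch of that idea using only the standard library (the page snippet and class name are illustrative, not from any real pipeline):

```python
from html.parser import HTMLParser

class ImgAltCollector(HTMLParser):
    """Collect (src, alt) pairs from <img> tags — the kind of
    automatic "labeling" large scraped datasets rely on."""
    def __init__(self):
        super().__init__()
        self.pairs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            # Only keep images that already carry a caption (alt text);
            # images without one are simply dropped, no human in the loop.
            if a.get("src") and a.get("alt"):
                self.pairs.append((a["src"], a["alt"]))

# Hypothetical crawled snippet: one captioned image, one without alt text.
page = '<img src="cat.jpg" alt="a cat on a sofa"><img src="x.png">'
collector = ImgAltCollector()
collector.feed(page)
print(collector.pairs)  # → [('cat.jpg', 'a cat on a sofa')]
```

Because the "labels" are whatever text happened to sit next to an image on the crawled page, nothing in this process inspects what the image actually depicts — which is how illegal material can end up in the dataset unnoticed.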