misk to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
The Barto@sh.itjust.works · 1 year ago (edited):
Legitimate reason? No, but there’s always a reason to know how to make napalm.