Today, the child safety organization Thorn, in partnership with the cloud-based AI solutions provider Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aimed at detecting unreported CSAM at scale.

    • Scratch@sh.itjust.works · 22 hours ago

      Not to mention the self-image impact such things would have on women with smaller breasts, who (as I understand it) generally already struggle with poor self-image due to breast size.

      • sunzu2@thebrainbin.org · 21 hours ago

        Clearly the state gives zero fucks about these women, or anyone else, or even “the children.”

        Catholic Church is still around for a reason

        • Halosheep@lemm.ee · edited 13 minutes ago

          Typically the state only cares about things they perceive as children.

      • Clinicallydepressedpoochie@lemmy.world · 22 hours ago

        If this is the price I must pay, I will pay it, sir! No man should be deprived of privately viewing a consenting adult’s perfectly formed small tits. They can take my liberty, they can take my livelihood, but they will never take away my boner for puffy nipples on a small-chested half-Japanese woman!