• @yxzi@lemmy.ml
    4 points · 3 years ago

    yeah sure, it might all seem harmless now, but it doesn’t change the fact that it’s a door-opener for Apple

    • @nurkurz@lemmy.ml
      3 points · 3 years ago

      Aren’t they already doing all of that, but just on their servers? I’d rather have them doing it on my device. There is, and will be, no way for us to know what they are doing with our data on their servers, but there will be people analysing whatever they do on our devices.

      Same with Google’s FLoC: do it on my device, where I have a chance to turn it off.

      The outcry really should have started when people started using devices that in reality are still owned by the company that made them.

      • Lilium
        7 points · 3 years ago

        If they are scanning their servers, you can just not send photos to their servers, but if they are scanning your phone or computer, that is a dangerous new line they just crossed.

      • Jedrax
        3 points · 3 years ago

        > Aren’t they already doing all of that, but just on their servers?

        Yes, they are. That being said, scanning does not occur if you have iCloud Photos turned off. I recommend turning all of iCloud off, including backups and iMessage, if you really care about privacy.

          • Jedrax
            1 point · edited · 3 years ago

            Kind of a disingenuous statement. I really care about privacy, but I also care about functionality, so I use an iPhone. Does that mean I don’t care about privacy? No, not at all.

  • Love_Monkey
    4 points · 3 years ago

    1. How do they train the AI?
    2. If my 15-year-old daughter sends nudes to her boyfriend, how will the AI know she is 15? Facial recognition?
    3. What about parents taking pics of their kids in the bath or pool?

    • @uthredii@lemmy.ml
      1 point · edited · 3 years ago

      They don’t use an AI. They compute a hash (a unique fingerprint) of every photo on your device and compare it against hashes of known child abuse images. Only images already in that database get flagged, and there are safeguards controlling what goes into the database (see the sketch below).

      This article explains it well: https://www.vox.com/recode/2021/8/10/22617196/apple-ios15-photo-messages-scanned

      Edit: obviously there are still issues with this, as there are probably ways for them to scan for other images.
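
      To make the fingerprint-matching idea concrete, here is a minimal sketch that assumes an exact cryptographic hash (SHA-256) in place of the perceptual NeuralHash Apple describes; the hash database and photo directory below are made-up placeholders, not anything from Apple’s system.

      ```swift
      // Toy sketch of hash-based matching against a database of known images.
      // NOTE: this is NOT Apple's actual system. Apple describes a perceptual
      // hash (NeuralHash) plus cryptographic protocols; plain SHA-256 is used
      // here only to illustrate "compare fingerprints, not pixels".
      import CryptoKit
      import Foundation

      // Hypothetical database of fingerprints of known prohibited images (placeholder value).
      let knownHashes: Set<String> = [
          "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
      ]

      // Returns the SHA-256 hex digest of a file's bytes.
      func fingerprint(of url: URL) throws -> String {
          let data = try Data(contentsOf: url)
          return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
      }

      // Flags only photos whose fingerprint is already in the database.
      let photoDir = URL(fileURLWithPath: "./photos")
      let photos = (try? FileManager.default
          .contentsOfDirectory(at: photoDir, includingPropertiesForKeys: nil)) ?? []

      for url in photos where url.pathExtension.lowercased() == "jpg" {
          if let hash = try? fingerprint(of: url), knownHashes.contains(hash) {
              print("match:", url.lastPathComponent)
          }
      }
      ```

      An exact hash like this would miss resized or re-encoded copies of an image, which is why the real system relies on a perceptual hash that tolerates such changes.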