• lseif · 9 months ago

    so because they say that they won't scan your images, you just trust them? the fact that Apple had planned to is evidence enough that they could, and possibly do. again, there is no way to prove that they don't.

    do you understand what i'm saying when i say "e2ee is almost meaningless on a closed-source app"? you are taking their word on whether they know your private key, or whether they even encrypt your data at all. to encrypt a file properly, use a local open-source program (gpg) before ever letting Apple touch it.
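    for example, a minimal sketch of that workflow (filenames and the passphrase here are placeholders, and this assumes gpg 2.1+ is installed):

```shell
# encrypt a photo locally first; the .gpg file is all iCloud ever needs to see
printf 'example data\n' > photo.jpg        # stand-in for a real photo

# --batch/--passphrase are only for this non-interactive demo;
# interactively you would just run: gpg -c photo.jpg
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-pass' \
    --cipher-algo AES256 --symmetric photo.jpg

# later, to get the original back:
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-pass' \
    --decrypt photo.jpg.gpg > restored.jpg
cmp photo.jpg restored.jpg && echo "round-trip OK"
```

    only photo.jpg.gpg ever leaves your machine; without the passphrase, Apple (or anyone else) just sees ciphertext.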

    btw, have you heard of the case where a person's picture was flagged as csam when it was sent to the kid's doctor during lockdown? these filters are not perfect, and can ruin someone's reputation. any pedophile with even a glint of common sense would avoid proprietary spyware (iCloud) anyway, or at the very least encrypt manually.

    again, your privacy is being eroded in the name of “saving the children”.

    • heavyboots@lemmy.ml · 9 months ago

      Everything you’ve said aside from the CSAM scan doctor thing has absolutely nothing to back it up so far. (And for the record, I absolutely agree CSAM scanners can be wrong; a human needs to be involved at some level, which they were in the system Apple devised.) At any rate, I guess this convo is over, as we obviously inhabit very different worlds.

      • lseif · 9 months ago

        well, if you're not willing to accept that Apple does not have your best interest at heart, then I suppose this conversation is over.