you think apple wouldn't abuse customer data just because of its brand image? that's awfully trusting of a company which has been proven to scan ‘private’ icloud images. most of their customers either 1) don’t care, 2) will believe it’s somehow justified, or 3) will forget soon enough
the great thing about open source is that people can audit it. and for a big project like android (aosp, grapheneos, etc. – as a parallel), people will. any new commits will be analyzed by maintainers. of course it’s not impossible to sneak something in, but it’s a lot less likely than in anything closed source, where developers are forbidden from disclosing any details to the public.
but if you’re willing to use siri and icloud despite the privacy concerns, that’s fine; every solution is a compromise.
*blink blink*
Scan private iCloud images? What part of the E2E did you miss? Also, if this is the plan I think you’re talking about for CSAM, they actually abandoned that, even though it was a pretty decent plan…
so because they say that they won’t scan your images, you just trust them? the fact that Apple had planned to is evidence enough that they could, and possibly do. again, there is no way to prove that they don’t.
do you understand what i’m saying when i say “e2ee is almost meaningless on a closed-source app”? you are taking their word on whether they know your private key, or whether they encrypt your data at all. to encrypt a file properly, use a local open-source program (e.g. gpg) before ever letting Apple touch it.
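for example, a rough sketch of what i mean (filenames and the passphrase here are just placeholders; in interactive use you'd drop `--batch`/`--passphrase` and let gpg prompt you):

```shell
# stand-in for a real photo, just for the example
echo "pretend this is a photo" > photo.jpg

# encrypt locally with symmetric AES-256 before the file ever reaches iCloud
# (--pinentry-mode loopback lets gpg 2.1+ take the passphrase non-interactively)
gpg --batch --yes --pinentry-mode loopback --passphrase 'example-passphrase' \
    --symmetric --cipher-algo AES256 -o photo.jpg.gpg photo.jpg

# upload photo.jpg.gpg instead of photo.jpg; the provider only ever sees ciphertext

# later, after downloading it back, decrypt:
gpg --batch --yes --pinentry-mode loopback --passphrase 'example-passphrase' \
    -o recovered.jpg -d photo.jpg.gpg
```

the point being: the keys and the encryption step stay on your machine, in auditable code, so you don't have to take anyone's word for it.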
btw, have you heard of the case where a person’s picture was flagged as csam after it was sent to their kid’s doctor during lockdown? these filters are not perfect, and can ruin someone’s reputation. any pedophile with even a glint of common sense would avoid proprietary spyware (iCloud) anyway, or at the very least encrypt manually.
again, your privacy is being eroded in the name of “saving the children”.
Everything you’ve said aside from the CSAM-scan doctor thing has absolutely nothing to back it up so far. (And for the record, I absolutely agree CSAM scanners can be wrong—a human needs to be involved at some level, which they were in the system Apple devised.) At any rate, I guess this convo is over as we obviously inhabit very different worlds.
well if you’re not willing to accept that Apple does not have your best interests at heart, then I suppose this conversation is over