Experts can reverse engineer the apps and debug the devices. Thus far, experts who have done this have found no evidence of these types of activities. All the evidence is anecdotal. I believe if this were a widespread practice, evidence would have been uncovered by now and it would have been reported on widely.
The implication here is really scarier than if they were listening to our conversations: it means they do not need to listen. The telemetry they already have is so good that in many cases they can predict what you will talk about with such accuracy that people assumed they had to be spying on their conversations.
Either way, we need to demand an end to this unprecedented mass surveillance.
I’ve said this in the past: it’d be pretty easy to do a blind or double-blind test.
Just have someone choose some random product(s), prompt an LLM to generate conversations about said product(s), run those through a text-to-speech generator, then put an earbud up to the mic on the phone and play X hours of the generated conversations into the mic. Give the phone back to the owner and see if they can determine what the product(s) were within a set time span.
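The blind-test scoring could be sketched like this — a minimal stdlib-only sketch where the product pool, the secret target selection, and the chance baseline are all hypothetical; the actual audio playback and ad observation would happen offline:

```python
import random

# Hypothetical product pool; in a real test these would be obscure items
# the phone's owner has never searched for or discussed.
PRODUCT_POOL = [
    "cast iron fondue pot", "left-handed fountain pen", "alpaca wool socks",
    "cold brew coffee maker", "hiking gaiters", "ceramic bonsai planter",
    "ultralight camping hammock", "mechanical keyboard wrist rest",
]

def pick_targets(pool, k, seed=None):
    """Experimenter secretly picks k target products (the 'blind' part);
    the phone's owner never sees this list."""
    rng = random.Random(seed)
    return rng.sample(pool, k)

def score_guesses(targets, guesses):
    """After the TTS audio has been played into the mic and the phone
    returned, the owner reports which products they believe were targeted
    (e.g. based on the ads they saw); count exact matches."""
    return len(set(targets) & set(guesses))

def chance_baseline(pool_size, k):
    """Expected number of correct guesses if the owner just picks k
    products at random from the pool (hypergeometric mean: k*k/N)."""
    return k * k / pool_size

targets = pick_targets(PRODUCT_POOL, k=3, seed=42)
owner_guesses = ["hiking gaiters", "alpaca wool socks", "ceramic bonsai planter"]
hits = score_guesses(targets, owner_guesses)
print(f"hits: {hits}, chance baseline: {chance_baseline(len(PRODUCT_POOL), 3):.2f}")
```

If the owner's hit rate across repeated trials isn't meaningfully above the chance baseline, that's evidence the ads aren't being driven by the played-back audio.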
Good idea! Would love to see the results from anyone testing it.
While I agree Apple isn’t the first company we would expect to do this, it’s a good conversation to have given the opaque legalese in most of the terms and conditions we agree to. Cox Media Group told their advertisers they had this ability, and whether or not that claim was true given its dubious legality, it demonstrates the intent is there and should be guarded against.
If you’ve talked about something, you or your conversation partner will very likely have seen or searched for it, or something similar. Which means Google has taken notice and will inform every advertiser on the planet about it.