Researchers have shown that by exploiting how Apple deploys the local differential privacy (LDP) model, they could infer individuals' preferred emoji skin tone and political leanings. Businesses collect behavioral data generated by users' devices at scale to improve their apps and services; however, these records are fine-grained and contain sensitive information about specific people.
LDP is meant to let businesses like Apple and Microsoft collect user data without obtaining personally identifiable information. The new research, presented at the peer-reviewed USENIX Security Symposium, describes how emoji and website usage patterns collected under LDP can be used to infer a person's emoji skin-tone choices and political affiliation. According to the academics from Imperial College London, this contradicts the guarantees that LDP promises, and more must be done to safeguard the data of Apple's users.
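For readers unfamiliar with LDP: the core idea is that each device adds noise to its own data before anything leaves the phone, so no single report is trustworthy, yet population-level statistics can still be recovered. Apple's production system uses more elaborate mechanisms (such as count-mean-sketch); the sketch below instead uses classic randomized response on a single bit, purely as an illustration of the principle, with made-up parameter values:

```python
import math
import random

def randomized_response(true_value: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise report its flip. For one bit, this satisfies
    epsilon-local differential privacy."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_value if random.random() < p else not true_value

def estimate_rate(reports: list, epsilon: float) -> float:
    """Unbiased estimate of the true fraction of 1s,
    debiasing the noise added by randomized_response."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

random.seed(0)
# Hypothetical population: 30% of users have some sensitive trait.
true_bits = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
print(round(estimate_rate(reports, epsilon=1.0), 2))  # close to 0.30
```

Each individual report is plausibly deniable, but the aggregate estimate converges on the true rate. The attack the researchers describe exploits a related effect: many noisy reports collected from the *same* user over time can likewise average out, revealing that user's individual traits rather than just population statistics.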
Company collects large amounts of data. It's exploitable. HOW COULD WE POSSIBLY HAVE SEEN THIS COMING???
All companies collect data, including personal data. They should respect privacy legislation (in the EU, the GDPR) and users' rights. Notably, personal data should only be processed for the purposes disclosed to clients. I don't think Apple would expose itself to the risks of simply misusing personal data.
"All companies collect data, including personal data." Or they could just not.
That could be possible, but they would still have to respect the GDPR or other legislation.
It may come as a surprise to learn that large corporations break the law at every opportunity. “Don’t put yourself in a position that this will harm you” should not be controversial advice.
I think the prerequisite is to comply with the law. Corporations have to respect the law like everyone else. It can be considered "normal" for lawyers or consultants to identify pathways to achieve a company's goals without violating the legislation. That is legal. Declaring behavior illegal is up to a judge, based on evidence.
Right, but that's still beside the point. The point is that if they're collecting obscene amounts of data to begin with, you should expect that they intend to use it for some ill, so you should avoid handing over that data in the first place. (Oh, or that someone will breach it.)