Apple’s privacy policies are just PR posturing.
In Round 2 of a fight that started four years ago, the Justice Department and the FBI are pressuring tech giant Apple to create “backdoor” access to its iPhone encryption software. The request comes as the FBI investigation into a shooting at a Pensacola, FL, naval base looks for information on the shooter’s iPhone.
The government requested similar assistance from Apple after the 2015 San Bernardino shooting. Apple refused, insisting that no backdoor exists and that creating one would compromise every iPhone, leaving user data susceptible to hackers. Four years later, Apple has held that position.
Some insist that a consequential legal precedent may emerge from this fight between two Goliaths, especially if the government wins. Moreover, Apple could lose face as one of Silicon Valley's most prominent pro-privacy companies. Or perhaps this recurring fight has received outsized attention, overshadowing other areas in which Apple's pro-privacy stance is more talking point than substance.
To the former point, Apple has already aided the FBI with "all of the data in our possession," although exactly what data Apple has shared remains unclear. Besides, as the San Bernardino case showed, the government does not necessarily need Apple's help with a backdoor. Rather than pursue a court appeals process, the FBI paid a private company $1.3 million to hack into the shooter's iPhone 5c, and it has since used that software to break into similar phone models. The Pensacola shooter owned an iPhone 5 and an iPhone 7, two models that a security expert recently told Bloomberg would be incredibly easy to break into.
It seems the government's real objective is to sway public opinion, implying that Apple abets criminal acts by refusing to cooperate. But Apple's pro-privacy position does not adequately protect user data in all instances; its strong stance has not always translated into sound business practices.
But Apple's own practices say nothing about third-party apps bought on its App Store. As a Washington Post investigation confirmed last year, Apple has little control over how app developers design their products to collect and divulge user data. Developers may publish privacy policies that adhere to Apple's transparency guidelines, which require them to disclose the third parties with whom they share data. According to the Post, however, users can't be sure those guidelines are even followed: many apps were found to be violating their own privacy policies.
But all of these are benign examples of data exploitation compared to Apple's operations in China. In early 2018, Apple moved all iCloud account data registered in China to mainland Chinese servers. Guizhou-Cloud Big Data, a state-run data management company, now operates Apple's iCloud servers there. Whereas users in the United States and elsewhere can decide in which countries their data is stored, Chinese citizens are confined to in-country servers, to which the government has on-demand access. And while Apple claims its compliance serves consumers — iCloud would otherwise have shut down in China — it has, at the very least, acquiesced to an authoritarian regime and sacrificed one of its core values in doing so.
Apple may be one of the more responsible tech companies in Silicon Valley. Facebook remorselessly ignores its own policies, and Google's Play Store is replete with malware-infested apps. But it's important that buyers not romanticize Apple's pro-privacy position, or mistake PR statements for commitments that outrank profit motives and bottom lines.