The Dartmouth
April 25, 2024 | Latest Issue

Solomon: Heads in the iClouds

Since the dawn of the digital era, the debate over privacy and security has been intense and fiercely controversial. It has morphed into a shouting match filled with arcane details and technical jargon, while the issue has become so politicized and polarized that a middle ground seems impossible to find. The matter has been portrayed as if personal privacy and national security lie on opposite ends of a spectrum, making it seem impossible to care more about one without caring less about the other. Recently, however, the debate has lost much of its exclusivity as a solely intellectual or political topic, finally reaching, even if only briefly, the mainstream dialogue.

On Feb. 16, a California judge ordered Apple to unlock an iPhone used in the December San Bernardino shooting. The stakes were high, and Apple had to tread carefully. The terrorist attack in question had left 14 people dead and 22 others injured, becoming a painful memory and a sensitive topic for many Americans who rightfully want justice. Hours after the court order, however, Apple released a letter to its customers, announcing its decision to appeal and explicitly refusing to aid in decrypting the iPhone. Generally, refusing to help the FBI in a terrorist investigation isn’t the best way for a company to better its public image, but the statement, signed by Apple CEO Tim Cook, offered a very reasonable set of arguments defending Apple’s decision.

For one, encryption allows iPhone users to protect important personal data, and any tool that Apple would create to breach those protections ultimately cannot be limited to just one iPhone, one type of device or one situation. Yielding to the FBI’s request, Apple argued, would encourage hackers and cybercriminals to mold and modify the decryption tool into a “master key” to access private data and would provide a precedent for the government to breach privacy rights every time they have a hard case to solve.

Ultimately, the FBI received assistance from an outside party it is unwilling to identify and was able to access the San Bernardino phone. That fact alone is worrisome. It is possible that by so adamantly requesting Apple’s help and implying that the only way to get into the phone was to have Apple build a backdoor into it, the FBI provoked an ego-war among hackers and cyber experts to prove that they could bypass the iPhone’s defenses. Had Apple agreed to help, it is also possible that the methods this “outside party” used to hack into the iPhone would have gone undiscovered. Perhaps the key Apple would have created would not have been abused, although we cannot know that with certainty. The consequences of this hypothetical, Apple-created key, if abused, would be even worse than the current situation. Whether Apple’s decision was the safest one is hard to know. What we do know, however, is that they started a conversation we desperately need to have.

Cook’s letter was a reminder of just how much sensitive, personal information we store on an iPhone, and indeed on any digital device. By risking the company’s reputation to safeguard Apple users’ data, Cook stood up for our rights to privacy, even when we as a society do not seem to value that privacy enough ourselves. We willingly store our personal experiences, contacts, itineraries, financial details and even health data and location information on digital devices that can be easily targeted. In my experience, we rarely ever read the privacy agreements we sign. We don’t blink an eye when we give our addresses and social security numbers to online services. We accept Facebook friend requests from people we don’t recognize. And we seem to put very little thought into our digital privacy.

Considering how careless we are with our own information, it seems we do not care enough that anyone with a computer and internet access could find out who we are, what we look like, where we live, where we work and how to contact us, all with just a perfectly legal Google search. We don’t seem too disturbed by the fact that major enterprises profit off of collecting and selling private data, and we serve them all of our information on a silver platter. Although Cook and Apple nobly stood up for our privacy and persuasively argued that the government has a duty to protect our rights, we have in our negligence failed to give the government any incentive to protect rights we don’t seem to really value.

The debate over privacy is not just rooted in whether the FBI had the right to access the San Bernardino phone, nor is it about whether protecting personal information paves the way for more national security threats. Perhaps if sensitive data were not so easily available, those threats wouldn’t exist in the first place. At the core of each of these contentious cases lies a history of choices, preferences and value judgments that have proven that we aren’t nearly as careful or as concerned as we should be. There are benefits to certain data being public, and there are certainly conveniences to using social media services — but there are also fundamental reasons to protect privacy and reject the normalization of a status quo in which our private lives are, voluntarily and systematically, put on display. It’s up to us to redefine our values, to reprioritize and to finally start being more conscious about our digital vulnerability. The next time you unlock your iPhone, think twice about whether you want your credit cards saved on it, whether you’re okay with sharing your location and whether all the apps and features you find so cool and convenient are truly worth the privacy you are losing.