The December 2015 San Bernardino terrorist attack, which claimed the lives of 14 people, sparked furious debate about gun violence and gun control, radicalism, and the asylum-granting process for refugees. Now, some three months later, it has ignited a new debate at the intersection of encryption, privacy, and government access. This is a conversation the nation desperately needs to have, and it will likely shape the future of the relationship between the US government and the tech industry as a whole.
And yet, it all started with one man’s letter and another’s iPhone.
Tim Cook, Apple’s CEO, penned the letter in question. And Syed Rizwan Farook, one of the perpetrators of the San Bernardino terrorist attack, owned the iPhone 5C in question. Apple’s relevance in the episode stems from the FBI’s investigation into Farook’s atrocities. In early February, that relevance was solidified when a California federal court granted the FBI an order compelling Apple to assist in unlocking Farook’s phone. Apple, standing as a proxy for the tech industry at large, condemns the order as a formal codification of the relationship between the government and the tech industry. To Apple and other tech companies, that relationship, as written, is inherently and fundamentally broken.
Apple has since decided to take a stand, directing attention toward the balance between private encryption and national security. Cook’s open letter has thrust the discourse into the court of public opinion, a move designed to motivate and include those it will affect most: everyday US citizens. Apple’s protest so far has been ideological, as most in the technology community agree that Apple could produce the software the FBI has requested. By formally refusing the court’s order, Apple has initiated a key dialogue between the tech industry and the US government at large. Yet ironically, by framing the issue as one of diametrically opposed ideologies, Apple has closed the book on that same dialogue. To appropriately address the questions surrounding encryption and third-party access, both the tech industry and the US government will need to move away from absolute rhetorical rigidity.
Ultimately, the logic saturating the cybersecurity industry is the same logic that pushes individuals to purchase home security systems, car alarms, or personal safes. More often than not, an ounce of prevention really is worth a pound of cure. Yet locked apartments, car doors, and vaults are not beyond the reach of the government, especially when legal entry is warranted. The same is not true of encrypted information.
Consider the iPhone. Most iPhones are passcode protected, and those passcodes are generally four- or six-digit strings. As such, most iPhones, and the litany of personal information they contain, are protected by a relatively short password. Using a program, one could simply try every possible combination of digits until the correct one is identified, a technique known as “brute-forcing.” To thwart brute-force attacks, Apple has integrated two key safety measures into the software that runs the iPhone. First, after an incorrect passcode is entered, the software imposes a delay before another attempt can be made. Second, there is an optional “wipe” feature: if enabled, it permanently erases all of the data on the phone after ten incorrect passcodes are entered in a short period of time.
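The mechanics of such an attack are straightforward. Below is a minimal Python sketch of the idea; the `check_passcode` function is a hypothetical stand-in for the phone’s verifier, not anything Apple exposes.

```python
import itertools

def brute_force(check_passcode, length=4):
    """Try every numeric passcode of the given length, in order.

    `check_passcode` is a hypothetical oracle standing in for the
    phone's verifier; it returns True only for the correct code.
    """
    for digits in itertools.product("0123456789", repeat=length):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None  # exhausted all 10**length possibilities

# A four-digit space holds only 10,000 codes, so with no delay the
# search finishes almost instantly. Apple's escalating delays and the
# optional ten-strike wipe are designed to defeat exactly this loop.
secret = "7392"  # illustrative value
print(brute_force(lambda guess: guess == secret))  # prints 7392
```

The example makes the underlying tension concrete: the cryptography is only as strong as the passcode unless the software itself rate-limits or punishes guessing, which is why the FBI’s request targets those software protections rather than the encryption.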
The philosophical divide between the US government and the tech industry is rooted in these unique security considerations. On one hand, Apple holds that such failsafes in iPhone software are necessary; to leave them out is to leave paying customers vulnerable to hacking threats. On the other hand, the FBI and other law enforcement agencies view these security measures as black holes from which potentially valuable information is unlikely to ever return.
It is important to note that the FBI is not asking Apple to remove the security measures the iPhone currently employs. Rather, the bureau is asking for another way to retrieve the information it believes is stored on Farook’s phone.
Apple believes that creating the backdoor the bureau desires would essentially weaken the entire security apparatus of the iPhone. To Apple, a backdoor is not a case-by-case accessory; the power to skirt security measures on one iPhone threatens the information stored on every device in circulation.
Apple even believes that a workaround it controls and allows law enforcement to use as needed is untenable. Its reservations stem from the realities of the current investigative paradigm, in which access to encrypted data is granted case by case. This, Apple believes, makes it incredibly difficult to draw the line between what is and is not legal or ethical. By casting the issue as an ethical concern, Apple has opened the discourse to emotional considerations that can foster ideological extremism. That extremism stems from the shift from policy-based questions to intent-based questions, wherein arguments are inherently laced with ad hominem language.
The investigation following the San Bernardino terrorist attack has illuminated key flaws in how complex questions about encryption and personal privacy are addressed. Throughout this case, both the US government and Apple have pursued efforts in line with their philosophical leanings. The FBI has compelled Apple to act through a court order, an instrument of the judicial system that is, by its nature, granted in an arena devoid of debate and public discourse. This single-mindedness is off-putting to the tech industry, especially given that the case covers relatively uncharted territory. On the other hand, Apple’s total rejection of the FBI’s wishes is fighting fire with fire.
This case demonstrates that the current laws regarding government access to encrypted data can no longer stand. The case-by-case nature of the current legal climate surrounding encryption must be replaced. In its stead, concrete policy created by legislative discourse must arise. This discourse would allow both the tech industry and law enforcement to air their concerns and more fully plumb the depths of this immensely complex issue. Ironically, leveraging the power of the legislative branch would actually foster the creation of a middle ground between two ideological extremes.