Edward Snowden’s revelation that the US government was spying on millions of communications between civilians sent shock waves through Silicon Valley. Major technology companies that had often been complicit in the surveillance program, such as Facebook, Google, and Apple, realized the full extent of government spying and faced public outcry over the lack of user privacy. They responded swiftly with heightened security measures; Apple’s iMessage and FaceTime, Facebook’s Messenger and WhatsApp, and Google’s Gmail, among other apps and services across the tech industry, now use or are adopting end-to-end encryption. In essence, end-to-end encryption ensures that the companies themselves cannot decrypt their own users’ messages: only the sender and recipient, the two “endpoints” of the exchange, hold the keys needed to decipher a message.
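To make the mechanics concrete, the sketch below uses the open-source PyNaCl library to show the kind of public-key encryption that underlies end-to-end encrypted messaging. It is an illustration of the general principle, not the actual protocol any of these companies deploy; the names alice, bob, and the sample message are invented for the example.

```python
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; the private key
# never leaves that device, so the service provider never sees it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# The service can relay public keys, but a public key alone
# cannot be used to decrypt anything.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_private, bob_public).encrypt(b"meet at noon")

# Only Bob, holding his private key, can recover the plaintext.
# The relaying company ever only handles the ciphertext above.
plaintext = Box(bob_private, alice_public).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```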
If the NSA knocks on Yahoo!’s door requesting information with the threat of a $250,000-per-day fine for noncompliance, as the government did in a case revealed last year, Yahoo! doesn’t even need to refuse. It can respond, correctly, that it simply doesn’t have the information. This new security model has made government surveillance more difficult, though certainly not impossible (formal requests for user information are hardly the only means of intelligence gathering), and has put substance behind companies’ user privacy agreements. However, end-to-end encryption faces firm opposition from federal agencies and the threat of legislative regulation.
In November, UK Home Secretary Theresa May unveiled the draft Investigatory Powers Bill, dubbed the “Snoopers’ Charter” by critics, which aims to update existing communications regulations in light of new technologies. For months, many in the tech industry feared the bill would ban end-to-end encryption outright. The final draft is more nuanced, but serves the same end of opening user information up to government access. Section 189 declares that the Secretary of State may issue orders to companies “relating to the removal of electronic protection applied…to any communication or data.” In effect, the government could order tech companies to remove end-to-end encryption or, more likely, ask Facebook, Google, or Apple to reengineer end-to-end encryption to provide a “backdoor” for government intelligence agencies.
Currently, there are no similar proposals on this side of the Atlantic, but the US federal government has voiced similar opposition to end-to-end encryption. FBI Director James Comey and Deputy Attorney General Sally Quillian Yates recently testified before Congress on this very issue. Comey provided the amusing description of end-to-end encrypted messages intercepted by the government as “gobbledygook.” Yates spoke more firmly on the issue. A mandate on companies using end-to-end encryption “may ultimately be necessary,” she said. Noting that critics of the Snoopers’ Charter and policies like it often assert that engineering a backdoor is not possible, Yates responded, “Maybe no one will be creative enough [to solve the problem] unless you force them to.”
Efforts to pass regulations in response to new security technology could, however, run into legal and constitutional roadblocks. End-to-end encryption may be defended under the Fourth Amendment protection against unreasonable searches, since wiretapping often occurs without proper warrants on civilians who are not suspected of any criminal activity. A 2013 Supreme Court challenge on these grounds, Clapper v. Amnesty International USA, was dismissed, but only because the plaintiffs could not prove they had been wiretapped. End-to-end encryption puts barriers in the way of mass government surveillance and may therefore be defended as a means of securing Fourth Amendment privacy.
Issues of government-enforced decryption may also jeopardize the Fifth Amendment protection against self-incrimination. With end-to-end encryption in effect and companies unable to comply with law enforcement orders, prosecutors in some criminal cases have demanded that the accused decrypt their own phones, computers, or individual files for evidence gathering or be held in contempt of court for “obstruction of justice.” Whether compelled decryption is a form of self-incrimination has yet to be decided definitively; courts have gone back and forth on the issue. James Grimmelmann, a University of Maryland law professor, has said the decision comes down to whether police have a justifiable reason to demand decryption: “If the police don’t know what they’re going to find inside,” he says, “they can’t make you unlock it.” Mass surveillance can similarly be cast as a blind search for incriminating evidence at the expense of users’ Fifth Amendment rights.
Proponents and apologists of government surveillance often assert that these rights to privacy are superseded by the indefinable and malleable concept of the state’s compelling interests, including national security and public safety. On these terms, the debate can devolve into an argument over values in which little ground is gained by either side. Perhaps a more persuasive argument against end-to-end encryption regulation is that it is bad policy, and that it works against those same compelling interests.
As previously mentioned, in response to government requests for a “backdoor” into encrypted user information, technologists and technology companies have responded that it is not possible without severely compromising the overall security of end-to-end encryption. One analogy often used in this argument is that “there’s no way to outfit a safe with a backdoor that only the FBI can open.”
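To see why, consider a hedged sketch of one common backdoor design, key escrow, again using PyNaCl; the escrow keypair, the send_with_backdoor helper, and the sample message are invented for illustration and do not describe any real system. The point is that a backdoor amounts to encrypting a second copy of every message to a master key, and whoever obtains that key, whether the agency, an insider, or a thief, can read everything.

```python
from nacl.public import PrivateKey, SealedBox

# Hypothetical escrow keypair held by a government agency.
escrow_private = PrivateKey.generate()
escrow_public = escrow_private.public_key

def send_with_backdoor(message: bytes, recipient_public):
    """Encrypt a message for the recipient AND for the escrow key."""
    to_recipient = SealedBox(recipient_public).encrypt(message)
    # The "backdoor": a second copy that bypasses the recipient entirely.
    to_escrow = SealedBox(escrow_public).encrypt(message)
    return to_recipient, to_escrow

recipient = PrivateKey.generate()
_, escrow_copy = send_with_backdoor(b"meet at noon", recipient.public_key)

# Anyone holding escrow_private, by warrant, by leak, or by theft,
# can decrypt every message ever sent through the system.
print(SealedBox(escrow_private).decrypt(escrow_copy))
```

The safe analogy follows directly: the escrow key is not a special FBI-only opening, it is simply another key, and its compromise compromises every user at once.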
The wave of tech companies employing end-to-end encryption is not solely a response to the Edward Snowden leak. It can also be read as a response to the dire state of cybersecurity, in which breaches have become a question of “when” rather than “if.” This is not the time for the government to mandate that companies scale back their security measures.
If we are considering the compelling interest of public safety, the mounting threat of cyber fraud and theft should be prioritized by the federal government, not purposely exacerbated by requiring major tech companies to hold massive stockpiles of data whose security has been deliberately weakened. And certainly, after reflecting on this summer’s OPM breach, in which the Social Security numbers of over 22 million federal workers were stolen, federal agencies are hardly on firm footing when demanding major alterations to Silicon Valley’s cybersecurity infrastructure.