The battle between the US Government and Apple over whether the Government can force Apple to create a “backdoor” to the iPhone is on pause for the time being, since the FBI announced that it had successfully hacked the iPhone and would help other law enforcement agencies around the country do the same.
But while the FBI won this particular skirmish, the broader ideological war rages on with no end in sight. For instance, the US Government has complained that WhatsApp, an app owned by Facebook that allows users to make calls and send text messages over the Internet, uses encryption technology that prevents law enforcement from listening to communications even with a wiretap order. It wants this type of police-proof encryption to be made illegal. But should such technology be banned? It’s not an easy question, and opinions are divided.
There are really two distinct questions, and society has yet to arrive at a decisive answer to either one. The first is whether encryption in general is a good or a bad thing. On the one hand, some argue that encryption empowers criminals and terrorists by allowing them to operate in secret and by frustrating law enforcement. On the other hand, law-abiding citizens also want and need encryption, both to protect their privacy and to avoid becoming victims of fraud. Industries such as banking could not survive without encryption. While encryption can empower bad actors, it is difficult to imagine a society that outlawed encryption altogether being an improvement over our current one.
Assuming, then, that various groups in society need strong encryption technology, and that such technology should not be outlawed, the second question is whether, and if so how, law enforcement should be able to bypass encryption. This second question contains both a normative component and a logistical component. The normative issue is whether people should effectively be required to keep a record of their private dealings that law enforcement can access. To use an imperfect analogy: would it be a good idea to outlaw shredders, given that they indisputably frustrate law enforcement to some degree?
But the logistical component is even trickier. Even those who think law enforcement should unequivocally be allowed to bypass encryption with a court order do not have a compelling answer to the practical question: if companies can and should be able to create strong encryption, and if law enforcement should be able to bypass it, how does this actually work? Should companies be required to design special workarounds for the police, as was requested in the Apple case? The creation of such “master keys” could effectively undermine the value of the encryption in the first place, since hackers may be able to steal the keys. Technology company employees could become targets for kidnapping and imprisonment around the world by various actors who seek access to the key. At best, companies will be subject to gratuitous and unprincipled orders in every country in which they operate. Additionally, even if the key is initially kept secret, once a criminal charge is laid, every defendant affected by a search using the key might be entitled to a copy. In some cases, such “master key” technology may even be difficult or impossible to create. Counsel for Apple raised similar arguments in the now-moot iPhone litigation. Further, if the FBI or its surrogates crack the iPhone, should Apple be told how they did it? Can the state use tax dollars to reduce privacy without sharing its hacking tricks with the technology companies?
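To see concretely why a stolen master key is so damaging, consider how a key-escrow scheme of this kind is typically structured: each message is encrypted under a fresh per-message key, and that key is then stored twice, once “wrapped” for the user and once wrapped for the escrow holder. The sketch below is a toy illustration of that structure only; the XOR stream cipher here is deliberately simplistic and insecure, the function and key names are hypothetical, and a real system would use vetted primitives such as AES-GCM.

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256-derived keystream.
    Illustration only -- NOT secure; real systems use AES-GCM or similar."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        # Derive successive keystream blocks from the key and a counter.
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

def escrow_encrypt(message: bytes, user_key: bytes, master_key: bytes) -> dict:
    """Encrypt under a fresh per-message key, then wrap that key twice:
    once for the user, once for the holder of the escrow 'master key'."""
    msg_key = os.urandom(32)
    return {
        "ciphertext": keystream_xor(msg_key, message),
        "wrapped_for_user": keystream_xor(user_key, msg_key),
        "wrapped_for_escrow": keystream_xor(master_key, msg_key),
    }

def unwrap_and_decrypt(record: dict, wrapping_key: bytes, field: str) -> bytes:
    """Recover the per-message key with a wrapping key, then decrypt."""
    msg_key = keystream_xor(wrapping_key, record[field])
    return keystream_xor(msg_key, record["ciphertext"])

user_key, master_key = os.urandom(32), os.urandom(32)
record = escrow_encrypt(b"meet at noon", user_key, master_key)

# The user can decrypt with their own key...
assert unwrap_and_decrypt(record, user_key, "wrapped_for_user") == b"meet at noon"
# ...but so can anyone holding the single master key -- for every user,
# every message, which is exactly the risk the paragraph above describes.
assert unwrap_and_decrypt(record, master_key, "wrapped_for_escrow") == b"meet at noon"
```

The point of the sketch is the asymmetry of failure: compromising one user's key exposes one user, but compromising the escrow key exposes every message the system has ever protected.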
This issue is not likely to go away. Technology marches onwards and documents more and more of our interactions. Should the result be a windfall to law enforcement or should individual privacy be the overriding value even at some marginal cost to security? Tough debates lie ahead.