Fog of CryptoWar
The Fog of Cryptowar - Why it’s not about crypto regulation.
Over the last two years, politicians in the USA, UK and elsewhere have been threatening the regulation of strong cryptography. But the experts and journalists who have expressed concern over this have done so in ways that we consider misleading. In this document we will recap the motives and strategies of the people who wish to regulate cryptography, the responses by its defenders and the battle over public opinion.
We will conclude that the picture painted in the media is misleading (as are the pictures painted by experts and activists) and would lead us to resist a straw man while missing the issues of substance.
Governments move towards the regulation of the use of cryptography.
Since the San Bernardino terror attacks in December 2015 and the debates about decrypting the attacker’s iPhone that followed, numerous statements by politicians and law enforcement officials have reignited fears of another “Crypto War” in which governments would use regulation to undermine the implementation and use of cryptography.
President Obama (USA) was quoted as saying: “I will urge high-tech and law enforcement leaders to make it harder for terrorists to use technology to escape from justice.” Similarly, Prime Minister Cameron (UK) said that law enforcement must “…ensure that terrorists do not have a safe space in which to communicate… We must … ensure that, in every case, we are able, in extremis and on the signature of a warrant, to get to the bottom of what is going on.” Last but not least, UK Home Secretary Amber Rudd stated that “End-2-End encryption is completely unacceptable.”
It is thus not surprising that journalists and experts on encryption were reminded of the first “Crypto War,” which was waged in the US in the 1990s when the government tried to suppress the development and export of strong cryptography products and to establish a key escrow mechanism for communication (the “Clipper Chip”).
Following the playbook of the 90s, journalists, experts and activists were quick to respond with the same arguments against crypto regulation that were used back then, culminating in Bruce Schneier being widely quoted as saying that a “Ban on strong encryption threatens to destroy the Internet.”
Such responses, however, especially when fueled by shallow media reporting and social media, may not be the right way to shape public opinion, and may entirely miss the bigger picture.
Politics as Negotiation.
Before entering the debate it is crucial to understand how politics, media and public opinion interact, and how communication strategies are employed by politicians to win their argument.
A common misunderstanding, especially when technologists interact with those making public policy, is that legislation should be built on objective facts that can be conclusively agreed upon. This “engineering view” misses the reality of politics.
Most public policy only has to stand trial in the court of public opinion and win the support of interest groups; it does not have to be defended in the realm of science. Nor is legislation confined to the technicalities of its field - it is evaluated in a much broader context.
When talking about politics, and specifically government action, it’s important to remember that governments and their administrations are not monolithic entities. They consist of groups with varied interests. The positions of the ministry of the treasury and the ministry of the interior may not align, and the demands of law enforcement agencies are often opposed to the evaluations of intelligence agencies. Only when these interests align can effective action be taken.
Politics has to deal with trade-offs and must convince the public, interest groups and media that its legislation is better suited to achieve the stated goals than competing suggestions. Dealing with trade-offs means that decisions are never perfect in the sense of fully satisfying any single interest. Instead, one tries to discover a solution that harms the various competing interests the least. In this sense politics is utilitarian. The trade-offs depend on value judgments that are informed by the ethical and economic perceptions of the day.
This makes politics inherently fluid, not dogmatic.
The decisive factors on which legislation can be enacted are the endorsement of relevant interest groups and the consent of public opinion. Politics thus has to signal its support to those interest groups while at the same time shaping public opinion through debate. This means that political actors must be strategic communicators, not primarily communicators of facts.
This results in a perspective in which politics is really something like a tug of war with many ropes at once - with the public standing in the middle. The opposing parties compete over the support of the public and of those interest groups that are necessary for any legislative action. Good arguments sway people to support one’s side, while bad arguments lead to the loss of allies among the public. Part of this competition is to pull the correct rope.
Most politicians know how to play this game. They might not be experts in any field themselves, but they must have the ability to select experts who can inform their position, and be able to network support while keeping the big picture in view. From that position they enter the debate.
Since politics is about strategic communication, proponents of one side often take an extreme position so that the resulting compromise lands as far toward their own position as possible, or they present straw men to keep their opponent from engaging with the core issues of the debate.
We believe that both tactics may be a fundamental part of the debate about crypto regulation, otherwise the debate would have ended two decades ago. Furthermore, the debate has been almost entirely confined to the topic of cryptography, while the problem stated by politics and by interest groups like law enforcement spans a much wider field.
It is this strategic communication that tempts some pro-crypto activists to repeat oversimplified arguments. We will address some of those arguments and the technical aspects of regulation in this field.
Bad arguments.
The reason for this shallow treatment of political realities is that, in the age of social media, shallow arguments go unrecognized as such by many participants in debates. The journalists, engineers, programmers and cryptographers in the debate tend to take positions that, in the context of a realistic view of politics, fail to be relevant.
Three lines of argument are especially noteworthy:
- “It is impossible to regulate cryptography.” or “Banning cryptography is like banning math.”
This argument misses the point by confusing knowledge about cryptography with the widespread use of cryptography - or, more specifically, the use of cryptography to protect confidentiality. While it would be beyond the reach of governments to remove knowledge about cryptography from the public sphere, it is certainly not impossible to threaten those who employ “illegal” cryptography with sanctions.
This is exactly what happens with most regulation: speed limits do not prevent thinking about driving fast; instead they address actually driving fast. It is behavior that is regulated, not thought.
So, similar arguments about the futility of regulation or the impossibility of enforcement aim a bit too high. Regulation does not require perfect adherence. Often it is enough if some people adhere to the specific law, and others can be punished when caught. Again, speed limits are not perfectly enforceable, but they limit the number of drivers who drive recklessly, and they allow taking action against some drivers, thus nudging other drivers into compliance.
- “Without cryptography, modern e-commerce is impossible, and the Internet would break.”
Again, this argument misses the point. It confuses the whole field of cryptography with the specific uses for authentication and integrity protection. It is certain that the lack of authentication and integrity protection would destroy e-commerce and make it impossible to operate the Internet as we know it securely. However, the position of crypto-regulators is to undermine the use of cryptography for the purpose of confidentiality. Granted, with today’s technology we rely on confidentiality for authentication and integrity: we transmit passwords and credit card information, which would become very risky without traffic encryption. However, nothing but convenience actually forces us to use passwords or credit cards, since other means of authentication and integrity protection exist that do not rely on confidentiality (a brief sketch follows this list).
Deploying these methods is possible, though it would incur substantial investments and many opportunities for error.
- “Any form of regulation makes cryptography insecure.”
This claim is a good example of not appreciating that any regulation exists in a much wider context. Of course regulation will make cryptography less secure than it could be. But since security is a gradient, public opinion and politics might be willing to accept less security to accomplish some other goal. Only a few proposals for cryptographic regulation are so fundamentally flawed - or so specifically worded - that they would undermine all security benefits of cryptography.
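To make the second argument above concrete: here is a minimal, hypothetical sketch of authentication that does not rely on confidentiality - a challenge-response login based on digital signatures, in which nothing secret is ever transmitted. The library (Python’s cryptography package) and all names are our own illustrative choices, not something proposed in the debate.

```python
# Hypothetical sketch: authentication without confidentiality.
# The client proves its identity by signing a random challenge;
# nothing in this exchange needs to be encrypted.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: the client registers its public key with the server once.
client_key = Ed25519PrivateKey.generate()
registered_public_key = client_key.public_key()

# Login: the server sends a fresh random challenge in the clear ...
challenge = os.urandom(32)

# ... the client signs it (no password or secret travels over the wire) ...
signature = client_key.sign(challenge)

# ... and the server checks the signature against the registered key.
try:
    registered_public_key.verify(signature, challenge)
    print("client authenticated")
except InvalidSignature:
    print("authentication failed")
```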
For now it is sufficient to recognize that these arguments have something in common - they all make absolute claims. Too much of the pro-cryptography activism centers around painting a black-and-white picture that is simply inapplicable to public policy debates unless one deals with the content of social ethics directly. Crypto regulation is not part of social ethics yet, and maybe there lies a hint that it should be. But in that case, the debate must be fundamentally re-framed.
What these absolute claims also hint at is a mode of thinking that is common in the information technology sector. Computer scientists - and a lot of pro-crypto activists are, or are informed by, computer scientists - seem to have a tendency toward binary thinking that demands either perfection or surrender when it comes to solving problems.
In combination with a lack of understanding of how politics actually operates, this too often results in arguments that can be undercut because of their absolutist claims or their failure to state the problem, or that are simply paternalistic. One thus often finds variations of the above arguments combined with comments like “We should take measures to improve tech-literacy amongst the authorities” (Awn Umar) or “My impression was that primarily she doesn’t know what she is talking about” (Paul Bernal, speaking about Theresa May, UK PM). The risk of this form of engagement is that the pro-crypto position could be easily marginalized, or kept occupied with a narrow aspect of the debate that is not the main thrust of novel regulation.
While we are very sympathetic to the position of defending the right to cryptography without reservation, it is our intent to warn fellow pro-crypto activists against appearing smug or disconnected from the public at large. This debate is not about convincing high-IQ people like yourself who are specialized in some field of computer science, mathematics or engineering. It is about dealing with a wide variety of people with very different backgrounds and an awareness of the necessity of trade-offs. Bad arguments, over-simplifications, apocalyptic visions and technocratic demeanor might get citations in the press, but they fall on deaf ears in politics, law enforcement and regulatory bodies.
Instead, it is necessary to engage with the opposing party, understand their position, their options, and their possible agenda.
Motives for regulating cryptography.
The quest to regulate cryptography has had several motivations throughout history. The earliest forms were informed solely by military needs to access the secrets of enemies. While this still plays some role, the motive for the current debate lies in the so-called “Going Dark Problem”.
Since the advent of mass individual communication technologies (telegraph, phone) and widespread information processing technology, law enforcement and intelligence agencies have become accustomed to intercepting and recovering large amounts of incriminating data without having to invest too much manpower or engage in risky infiltration. In parallel to the scientific, technological and digital advances in all fields of life, the methods of investigation and intelligence gathering shifted from human-driven (HUMINT) to technology-driven (SIGINT, ELINT, etc.) methods. This also fostered a growing disconnect between enforcement agencies and the population, as well as a relative decline in the number of officers engaged in the interpersonal nuances of police work. The same is true for intelligence agencies.
This vision of police and intelligence work has been reinforced both by politics demanding more substantive information and by media portraying modern investigative work as a combination of cutting-edge forensics, all-knowing computers, cell phone tracking and Internet tracing. Since we live in a mass media society, these portrayals in news coverage and entertainment products alike have shaped the expectations of the population. It is quite common these days that juries put all emphasis on forensic evidence while ignoring other information, and are shocked when the forensic results are less comprehensive and clear-cut than they have come to expect from shows like CSI (Crime Scene Investigation, a cop show centered on forensic experts).
By now global, instant, widely available and cheap communication is a normal part of life, and almost everybody owns personal computing devices (PCs, laptops, tablets, smart phones, game consoles) that outperform anything that was available just a few decades ago. This technology is of course available to criminals and other targets of law enforcement and intelligence agencies, as it is to anybody else.
At the same time, cryptography became widely available. Since the 1990s anybody can, with some effort, use cryptography that is practically unbreakable. Indeed, cryptographic protection has become so widespread that it often goes unnoticed. It protects our online shopping, credit card transactions, cellphone calls, and a myriad of other applications. The modern world is unthinkable without cryptography protecting the integrity of data and allowing us to authenticate to remote systems for a host of useful purposes.
However, it was the advent of individual use of cryptography to protect the confidentiality of communication that ushered in a new time. Instead of using cryptography only in the interaction with companies and the state, cryptography is now widely used for the protection of personal computer data storage and inter-personal communication. Every major operating system today ships with tools for hard disc encryption, and a whole host of messenger services offers encryption of communication directly between the persons that want to talk with each other, without relying on the security of the provider itself. Again, criminals are amongst the many users.
Now police and intelligence agencies are increasingly confronted with communication they cannot tap anymore, and personal notes they cannot decrypt anymore. Confiscating computers and smartphones is no longer a guarantee of gathering evidence that would stand up in court.
The situation is further worsened by the wide availability of anonymous communication tools like Tor, I2P etc. Not just the content of communication becomes virtually inaccessible, but also the fact of who communicates with whom.
This quick and widespread individual use of cryptography has an increasing impact on long-cherished investigative methods, leading to more and more cases that cannot be solved or that don’t lead to convictions in court. Sources of information that were long relied upon are now “going dark”. Of course, law enforcement opposes this development. They want their work to be as easy and effective as possible. But it is also a development that receives critical attention in the public discourse. Many people are not willing to accept the criminal who leaves the courthouse with a smile, simply because his computers could not be decrypted.
This is especially true when it comes to the two areas of crime that combine the notion of universal evil with the use of cryptography like no other: Computer Aided Child Exploitation (CACE) and International Terrorism (IntT).
Cryptography has helped criminals of both kinds to cover their tracks and conceal evidence in many high-profile cases. Because both crimes are universally considered to be of the worst evil - exploiting and killing the random innocent - they fuel public outrage like nothing else. The public demands that law enforcement prevent those crimes, and bring the perpetrators to justice.
It is thus no wonder that the new debate about crypto regulation was initiated by law enforcement failing (for a while) to access the iPhone of the San Bernardino attackers because it was encrypted. After every major terror incident we now see law enforcement and politicians complain about information being inaccessible because of technical protections. Similarly, cases of alleged child pornography consumers - whose hard discs are so well encrypted that the court cannot rely on them for evidence and who therefore escape prosecution - have repeatedly made the news.
In this context the outrage felt by many in politics, law enforcement and the public about the protections granted by cryptography is understandable, possibly even justified.
It is important to really grasp the core of what is going on here. Possibly for the first time, methods to keep evidence from law enforcement have reached universal availability and widespread use. This is quite possibly a qualitative change of singular importance. All previous means of hiding from law enforcement were based on error-prone wit or physical protections that could be overcome or fail randomly, or were simply not widely available. Effective means to oppose, or hide from, law enforcement have previously been banned from personal use - effective body armor (in many countries), guns, doors that can resist police raids, forged identification papers, face masks…
It is in light of cryptography providing effective limits to court orders and warrants, and of the history of previous regulation to make law enforcement effective, that regulation of cryptography is now being demanded.
The main question remaining is if and how cryptography can be regulated without causing too much collateral damage to the societal uses of cryptography.
It is necessary to stress that these positive uses exist, and are widely accepted. Even law enforcement and intelligence agencies have no interest in making cryptographic protections disappear completely, simply because they prevent a whole host of crimes every second and protect secrets of national importance against foreign spies. It could reasonably be said that everybody today loves and relies on cryptography, except for those few cases where it prevents the enforcement of law.
This must lead to the realization that the current debate, ongoing since 2015, is fundamentally different from the first Crypto War in the 1990s. The goal is not, and cannot be, to snatch strong cryptography from the hands of people. Instead, the current debate is about making the secrets that cryptography protects accessible to law enforcement. This is no minor point, since it deeply shapes the approach that regulators take, and it is therefore the point with which pro-crypto activists must engage. Failing to see that the goal is access to the plaintext confines arguments to a space that is neither relevant nor commonly understandable to public opinion. While it is certainly difficult to appreciate this difference from the perspective of cryptography, it is nevertheless substantial - because it allows for very different technical implementations and legislative action.
Insisting that plaintext access is the same as banning strong cryptography misses the point and excludes pro-crypto activists from the debate. Instead one has to engage with cryptography in the actual technical context, including the hardware it is run on, the operating systems, networks, and current structure of service providers.
Access to plaintext is of interest for law enforcement primarily in four forms:
- Data at Rest.
This refers to data that is stored on the user’s local computer or phone. Cryptography here hinders access through device and hard disc encryption. Device encryption is by now a common feature on smartphones, and all major operating systems for personal computers include software to encrypt the local device, including full disk encryption which prevents all data except for the bootloader from being understood by anybody who does not have access to the user’s secret key or password.
- Data in Transit.
The contents of communication between two or more parties that is carried by telecommunication networks, especially the Internet. Previously wiretaps would reveal this information, but with the use of encryption a growing part of the contents of Internet communication cannot be understood by anybody that does not have access to secret keys held only by the communication partners.
- Data in Cloud.
A growing amount of data falls in between the “Data at Rest” and “Data in Transit” categories because it is stored remotely with cloud service providers. While the data is readily available to law enforcement through subpoenas and warrants, an increasing amount of data in the cloud is now encrypted. The cloud also serves as a means to transfer data between multiple parties without retransmission from the local device. Cloud data is especially valuable to law enforcement because it contains local device backups and histories/logs of many services. The contents of email accounts should be considered “Data in Cloud” as well.
- Data in Processing.
This is data currently processed by a device and located in the volatile memory or temporary files.
In security terms the data at rest is the easiest to protect, since an attacker needs physical access to the device.
Data in Transit is more readily available since it travels a number of links on the Internet - including potentially insecure wireless networks - but apart from attacks on the user’s own local network, most of it is not readily available to the common criminal (though law enforcement wiretaps and intelligence agency surveillance are commonplace).
The data that is least easy to protect is data in the cloud. Numerous successful hacks of cloud providers and enormous data leaks every few weeks attest to that.
In all three cases encryption serves as a meaningful way to secure data against unlawful use. Personal devices are stolen frequently, wireless networks are sniffed easily, and cloud storage providers are a juicy target for any hacker.
“Data at Rest” and “Data in Cloud” are similar from the point of view of cryptographic protection, because both require only a single party (the user) to have access to the key that encrypts/decrypts the data. “Data in Transit”, however, requires that sender and recipient share keys; in practice both even share a single secret key valid for one communication.
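To illustrate the last point, here is a minimal sketch (our own, not from the text) of how two communicating parties typically end up holding the same session key: a Diffie-Hellman exchange followed by key derivation. Library and parameter choices (Python’s cryptography package, X25519, HKDF) are illustrative assumptions.

```python
# Illustrative sketch: sender and recipient derive one shared session key.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key ...
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())

# ... and derives the symmetric session key that will protect the traffic.
def session_key(shared_secret: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"session").derive(shared_secret)

# Both ends hold the same key; neither a provider nor an observer does.
assert session_key(alice_shared) == session_key(bob_shared)
```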
Law enforcement would like to get access to all three forms of data, and precisely to the plaintext (unencrypted) content. Access is of interest in two time variations:
- “Realtime access”.
This is the equivalent of the old wiretap. Law enforcement would like to record or listen in on communication while it happens. This applies directly only to Data in Transit. For realtime access, Data in Cloud is sometimes an option if communication tools use the cloud to store conversation histories, or to access email communication.
- “Post-Fact access”.
This is equivalent to a regular search warrant. Law enforcement would like to access data stored on the local device and in the cloud. This is the current focus of the debate: after a crime has been committed, police search for evidence to present in court and for information that would produce new investigative leads. Furthermore, law enforcement usually seeks access to communication data logs stored by providers.
Realtime access to both Data in Transit and Data in Cloud is the most controversial in the debate. Both must be considered surveillance and happen without the affected person knowing about it. Post-fact access to devices in the possession of a suspect is much less controversial, since it is equivalent to the standard search warrant that is commonly accepted by the public.
Post-fact access to devices is also least controversial from the point of view of international investigations. It only rarely requires action within more than one jurisdiction, and time constraints usually allow for legal processes to be adhered to. This is also reflected in the fact that international standards for digital evidence collection are debated and agreed upon in various international forums - notably the EU and the G20.
For post-fact access to devices many issues of international cooperation and multi-jurisdictional applicability of law are much clearer and easier to solve than for realtime access for data in transit, or any access to data in cloud. This is of particular import for law enforcement because here the goal of investigations is usually the presentation of evidence that stands up in court.
It must be noted, however, that two hybrids between the time variations and storage forms exist that have no previous parallel in analog technology:
- “Realtime access to Data in Processing”.
This is of interest for capturing communication data before it is encrypted and transmitted. In essence this would constitute a tap into the local device - a “telephone bug”.
- “Realtime access to Data at Rest”.
Law enforcement might want to remotely search a device that is in the possession of a user. This differs from the usual search warrant because the user would not be aware of such a search while it happens, and would thus be delayed in - or even prevented from - seeking legal recourse.
Both of these methods are controversial because of their hidden nature and the necessity of remotely exploitable security bugs in the user’s device or a pre-installed backdoor. Nevertheless they are already considered, or even codified, in several jurisdictions because they prevent evidence from becoming inaccessible through encryption, or the loss/destruction of the user’s device.
A further issue with realtime remote access to a user’s local device is that it poses questions for the admissibility of evidence. The precise targeting of the device, as well as the ability of such a remote access to modify data without trace, should cast doubt on any data collected. Both errors and undetectable falsification can occur and are extremely difficult to prevent - if they can be prevented at all.
The above points mostly concern law enforcement investigations. They do not apply equally to intelligence agencies (IAs), for a couple of reasons:
- Intelligence Agencies often fall under special legal regimes. Due to the fact that they often have no direct law enforcement powers, and because they often operate outside their home jurisdiction, they are imbued with special legal privileges that restrict their methods much less - amongst them not requiring previous legal codification of the methods they might want to employ.
- Thus, IAs have the ability to directly hack, steal or manipulate devices.
- IAs frequently pressure, infiltrate or hack service providers.
- IAs are far more concerned with the actions of foreign hostile actors. They are not only interested in stealing the secrets of foreign governments, but also want to protect their own government and key industries against attacks by the same. This puts them in the double position of being both offensive and defensive in their activities. Globally undermining cryptography in a transparent way could potentially backfire and harm their mission.
- IAs do, however, try to covertly undermine cryptographic research and algorithms in such a way that the weakened products are attackable only by themselves. This is a very risky game, especially when discovered, or if the secret knowledge that is the foundation of such an asymmetric weakening becomes known.
As such intelligence agencies are not the primary actors in the cryptography regulation debate. They either choose to abstain from the topic altogether, or only partake in the debate in a rather covert way. It is also not unlikely that they might occasionally put their defensive purposes first and thus become temporary - and questionable - allies of pro-crypto activists.
No treatment of the “going dark problem” and the interests of law enforcement and intelligence is complete without highlighting a part of the debate that is all too often conveniently omitted from politicians’ speeches.
The spread of digital communication - encrypted or not - has led to a plethora of new information that is already available to the relevant parties. This is the whole realm of metadata - data about data - or, specifically here, data about communications. Police and others now have access to a wealth of information about who communicated with whom, when and for how long, as well as location data for nearly every communication device that is powered up. The range of devices that constantly produce this kind of data is growing daily, from mobile phones to cars, power meters to TV sets.
Metadata has contributed as much to changes in law enforcement as the “going dark problem” has. New methods of investigation, often very effective, have become available. And contrary to content data, metadata lends itself to automated processing and analysis - leading to new problems like global mass surveillance.
Interestingly these new opportunities for law enforcement and intelligence, and those new threats to citizens’ privacy, do not appear in the calls for crypto regulation. We shall come back to this issue below.
Technical aspects of crypto regulation and plaintext recovery.
We shall now give an overview of means by which regulation of cryptographic applications could potentially soften the Going Dark problem. Afterwards we will look at challenges that impact all regulation attempts in this field.
Means to regulate:
Outlaw strong algorithms
The first attempt at regulating cryptography was the outlawing of strong cryptography, forcing users to rely on algorithms that could be broken by intelligence agencies, and potentially by law enforcement. This approach is off the table today because the knowledge and processing power to attack those algorithms is, or would be, widely available. Most governments, and more than a few corporations and criminal organizations, would be put in the position of being able to intercept and decrypt communications and therefore have access to secrets globally. This would put economies and nations at an unprecedented risk in a world that is shaped by, and relies on, international secure communication. We simply rely on strong cryptographic algorithms to deal with the risk of computer break-ins, espionage, cyber war and computer crime.
The same applies to variations of weak algorithms like limiting the effective key size, transferring keys through side channels or oblivious transfer, or mandating predictable random number generators for key generation.
Manipulate strong algorithms
There have been attempts by intelligence agencies (specifically, the NSA) to manipulate algorithms so that their strength relies on the secrecy of hidden, or underhanded, parameters. This approach reduces to the outlawing of strong cryptography, since the secrets on which the security of the algorithm rests would either have to be spread widely to be usable by law enforcement, or everybody would be at the mercy of the party that knows those secrets. While it might be an interesting method for a single intelligence agency, it would fail to soften the Going Dark problem in a meaningful way and at the same time create a power asymmetry that dooms the acceptance of such a scheme.
Undermine Protocols
Some recent statements by politicians (especially Amber Rudd, UK Home Secretary) have hinted at making end-2-end encryption illegal, especially for messaging services. This would result in protocols that may provide confidentiality between user and provider, but not between user and user when communicating through a provider.
Similar suggestions exist for mandating that all communication systems should retain interception capabilities. As with the Lavabit case, law enforcement might rely on the cooperation of the provider, or the sharing of identity keys with law enforcement to enable man-in-the-middle attacks, to gain access to data in transit.
This is a possible approach for regulation since it leaves most of the existing infrastructure in place and puts all liability on the communication providers and intermediaries - as is already the case with lawful-interception legislation for telephone etc.
The products most affected by this variant would be those that try to offer secure communication that is inaccessible to anybody but the final sender and recipient. Those products include instant messaging services, voice-over-IP telephony, video conferencing and encrypted email (PGP/S/MIME). Furthermore, Virtual Private Network links would fall under this approach, even though they don’t rely on an intermediary.
Making end-2-end encryption illegal means that all security rests on the communication provider or intermediary, and potentially also on the certifier of the keys that the parties require for mutual identification and integrity protection. Big global providers especially are thus put in the crosshairs of hackers and foreign governments, since they present a treasure trove of valuable information.
Auditable communication
Auditable communication is used, and often mandated, in some industries already, such as banking and high-security environments in which traffic traveling through a local network must be inspected by security appliances. To enable this, security devices need a way to decrypt the traffic, either by active interception and re-encryption (man-in-the-middle) or by using deterministic key generation whose secret is shared with the security appliance. This approach can be applied to any mediated communication that transits a provider, as mentioned in the previous section. It is therefore nothing novel.
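As a hedged illustration of the “deterministic key generation” variant mentioned above: if the endpoint derives its session keys from material that the inspection appliance also holds, the appliance can re-derive those keys and decrypt the audited traffic. All names, parameters and the library choice (Python’s cryptography package) below are hypothetical.

```python
# Hypothetical sketch: session keys derivable by a security appliance.
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Provisioned out of band to both the endpoint and the appliance (assumption).
ENTERPRISE_SECRET = b"shared inspection secret"

def derive_session_key(connection_id: bytes) -> bytes:
    # Deterministic derivation: anyone holding the secret can recompute the key.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=connection_id).derive(ENTERPRISE_SECRET)

endpoint_key = derive_session_key(b"conn-42")
appliance_key = derive_session_key(b"conn-42")
assert endpoint_key == appliance_key  # the appliance can decrypt this session
```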
Key Escrow
Key escrow in the strict sense means that all keys (in this debate, confidentiality keys) must be shared with a trusted agent - such as a government agency - before they can be used for encryption. In the case that encrypted data must be decrypted under a warrant, the police would then request the key from the agent and perform the decryption.
While possible to implement from a purely theoretical point of view, key escrow mechanisms are inherently complex when deployed on a larger scale. There must be a secure way of transmitting the secret keys between the user and the escrow agent, and those keys must be made accessible to law enforcement in some way.
Very naive approaches use only one additional, global key to secure this key transport. But this makes that global key a secret on which the confidentiality of all communication within the domain of regulation would rest. The escrowed keys must be stored, managed and protected against unlawful access.
If recent history is any indicator, building such a system even on a national scale is unrealistic. Many government agencies have suffered fatal data breaches recently, including the NSA (which specializes in keeping secrets), the CIA (the same) and the Office of Personnel Management in the USA. This list of breaches is far from exhaustive, but it demonstrates the risk a key escrow agent would face.
This risk is compounded by the fact that two conflicting requirements exist for an escrow agent. On the one hand it must protect all keys against unlawful access; on the other hand it must establish a way to share those keys with law enforcement in a timely manner. This makes it necessary to keep some form of the key digitally available and online - which in turn exposes that key to attacks.
To mitigate the risk of a single escrow key, some schemes suggest splitting the user’s key among multiple key escrow agents that then have to cooperate to reveal it (a sketch follows below). While the security of these schemes is higher, they also multiply the complexity and cost of such a system, especially with regard to deployment and operation.
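The sketch below shows the simplest form of such splitting, a hypothetical 2-of-2 XOR split: neither escrow agent alone learns anything about the key, and both shares are needed to reconstruct it. Real proposals would more likely use threshold schemes such as Shamir’s secret sharing; this is only meant to convey the idea.

```python
# Illustrative 2-of-2 split of an escrowed key between two agents.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(key))               # random pad held by agent A
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # remainder held by agent B
    return share_a, share_b

def reconstruct(share_a: bytes, share_b: bytes) -> bytes:
    # Only the two agents together can recover the user's key.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

user_key = secrets.token_bytes(32)
agent_a_share, agent_b_share = split_key(user_key)
assert reconstruct(agent_a_share, agent_b_share) == user_key
```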
Furthermore, the process by which law enforcement can request keys from the escrow agent(s) must be secured and authenticated, meaning that law enforcement needs some form of authentication key that would be used to demonstrate legal access. Each authorized agency and office would require one of those authentication keys. However, since each of those keys comes with the ability to extract an escrowed key from the agent, the security of a key escrow scheme would rely on the secrecy of every one of those authentication keys.
Additional problems like secure key rotation, availability of the agent, and cost of operation would likely turn this approach into the biggest and most complex government mandated information system project in history. The risk of failure to deploy, security breaches, and the cost of operation make such an approach unrealistic.
Another problem of key escrow systems is the scope in which they are to be deployed. If they are deployed as a global infrastructure, the management and regulation would require global political coordination. If they are however deployed on a national scale, they would require some means to enforce the specific demands of the jurisdiction on the user’s device - like choosing the transport key of the national key escrow agent.
A further problem of key escrow mechanisms is that they conflict with cryptographic best practices, especially Perfect Forward Secrecy, where a new key is generated for each message and old keys are immediately destroyed. This ensures that a leak of keys does not put all communication at risk of being decrypted, but only the communication during the short time frame for which the key was stolen. Key escrow systems, however, require that keys be shared with the agent, which introduces both long-term storage of secret keys that can potentially decrypt years of communication and an enormous amount of traffic between user and escrow agent, since every new key needs to be escrowed.
Another best practice that is incompatible with key escrow is the use of authenticated encryption. Here the same key is used not only for confidentiality, but also for integrity protection (and indirectly authentication) of the communication. Sharing this key with an escrow agent would allow the agent not just to read the communication, but also to manipulate it without the original parties being able to detect this. This means that not only the confidentiality of data is at risk, but also the security of the communicating devices.
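A short sketch of this concern, using AES-GCM as a stand-in for authenticated encryption (our illustration, not a scheme from the text): whoever holds an escrowed copy of the key can not only decrypt traffic but also forge messages that the recipient will accept as authentic.

```python
# Illustrative sketch: with authenticated encryption, one key both reads and forges.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # imagine an escrowed copy at the agent
sender, agent = AESGCM(key), AESGCM(key)

nonce = os.urandom(12)
ciphertext = sender.encrypt(nonce, b"meet at noon", None)

# The agent can decrypt the genuine message ...
print(agent.decrypt(nonce, ciphertext, None))

# ... and can also produce a forged message that verifies perfectly.
forged_nonce = os.urandom(12)
forged = agent.encrypt(forged_nonce, b"meet at midnight", None)
print(AESGCM(key).decrypt(forged_nonce, forged, None))  # the recipient accepts it
```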
Instead of the user generating a key and then sharing it with the escrow agent, the escrow agent could also generate keys for the user. This suffers from the same problems, but introduces an additional one: the security of all keys relies on the security of the key generation method employed by the escrow agent. Implementation mistakes in cryptographic algorithms are commonplace enough that this could potentially lead to a situation in which the security of all keys is undermined without anybody being able to detect it - except for a successful attacker.
Advances in cryptography may also lead to key escrow becoming much more secure. For example, various proxy re-encryption schemes could be employed to mitigate many of the security problems of previous approaches and reduce the complexity of implementing key escrow.
Content Escrow
Instead of encrypting data end-2-end between the intended sender and recipient only, a third party (called agent) can be introduced to which all content is encrypted. Various protocols exist that make this possible and enforceable, as long as at least one of the original parties is honest. The communication can then be intercepted by regular means and decrypted if the need arises.
Content Escrow schemes allow the continued use of some forward secrecy mechanism as long as the agent actively supports them.
One additional problem of content escrow mechanisms is that the agent plays an active role in communication, which increases the demands on the reliability and availability of the agent. Should the agent become unavailable, this could (depending on the protocol) prevent communication, which turns the agent into a single point of failure and would make it a prime target for denial-of-service attacks.
Key Recovery
Key Recovery schemes are similar to Key Escrow schemes in that they make keys available to a trusted third party. However, the keys are not handed to an escrow agent to be stored; instead, recovering them requires access either to one of the communicating devices or to a realtime interception of the communication.
In key recovery schemes the confidentiality keys generated by the user are stored in a secure storage module of his device, stored in a remote cloud account, or transmitted with his communication. In each case the keys are encrypted to one or more escrow agent keys.
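A hedged sketch of the “transmitted with the communication” variant, loosely reminiscent of the Clipper Chip’s law enforcement access field: the session key is additionally wrapped under an escrow agent’s public key and attached to the ciphertext, so recovery requires both an intercepted copy of the message and the agent’s private key. Key sizes, padding and names are illustrative assumptions, not a concrete proposal.

```python
# Illustrative key-recovery field attached to an encrypted message.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

agent_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
agent_public = agent_private.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender encrypts normally, then adds the recovery field for the escrow agent.
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"hello", None)
recovery_field = agent_public.encrypt(session_key, oaep)  # travels with the message

# Later, under lawful access: the agent unwraps the session key from the field.
recovered_key = agent_private.decrypt(recovery_field, oaep)
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))
```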
Key recovery schemes have the same problems that key escrow schemes have, but they are less resource intensive because no communication with the escrow agent is required by the user. Instead the existing interception capabilities of communications providers are used only in those cases when a need for interception actually arises.
Key recovery schemes for data at rest, especially encrypted devices, are a seemingly attractive approach because any access to the secret keys would require access to the device as well as cooperation of the escrow agent(s). This could potentially satisfy part of the law enforcement demands without undermining security too much. However, the implementation of such a recovery scheme would require the creation and deployment of special secure storage modules in all relevant devices - current devices would not be covered.
A final note should be added concerning key escrow, content escrow, and key recovery. All these approaches are brittle in the sense that there is no guarantee that they will work when they are most needed. Verifying that such a scheme works in a specific case requires actually decrypting the data of interest. If such verification is not undertaken frequently, these schemes might break without anybody noticing. However, this creates new legal problems, since the interception and decryption of data for verification purposes is hardly justifiable by current standards of law. Attempts to verify those schemes by employing the (automated) cooperation of the communication partners only apply to data in transit, and always rely on the honesty of at least one party. Since these schemes are only considered in order to catch criminals (people who actively and intentionally break the law), such cooperation cannot be assumed. It is this verification problem (among other aspects) that doomed the famous Clipper Chip key recovery system that the USA tried to roll out in the 1990s. Since then, no substantial improvement on this front has been made.
Mandatory Key Discovery
Several jurisdictions (UK, indirectly USA and Canada, amongst others) have codified laws that are meant to compel suspects to reveal their secret keys and passwords to law enforcement or the court. If the suspect does not comply, fines and prison time await him.
This approach suffers from technical, practical and legal problems:
First, it is of no use if the suspect employed Perfect Forward Secrecy in his communication, or used timed encryption for his storage devices.
Second, it is hard - and sometimes impossible - to distinguish between a suspect that is unwilling to reveal his keys and one that is unable to - either because he forgot or he never actually knew the keys (mis-attributed device, or hardware security token that has been destroyed).
Third, it is questionable whether anybody should be compelled to produce incriminating evidence against himself. Since we are not legal experts, we must refrain from further judgments. However, the legal implications are deeply troubling.
Insecure default settings
It seems that one of the approaches that has been tried by both the USA and the UK is to influence software and hardware vendors to abstain from making strong cryptography the default configuration of their products, while keeping the capability intact.
This attempts to at least catch the low-hanging fruit: the fully incompetent criminals. Surprisingly, this might actually be a productive means, since criminals in general are caught because of their incompetence - until they learn.
Remote Access Schemes
A prominent approach to solving the Going Dark problem is to allow law enforcement remote access to the device of a suspect. Several variations of this method exist, which we will cover below. Common to those variations is that they suffer from three problems:
- Access control for the use of these remote access methods is a hard problem. Only law enforcement, and ideally only with a warrant, should be able to use them. Hackers and foreign governments must be excluded. This essentially mirrors some of the problems that key escrow systems have. There must be a secure way of targeting the device, and the necessary access credentials (or other secret knowledge required for access) must be securely managed.
As is evident from the NSA leaks and the CIA Vault 7 leaks, it is an enormous undertaking to guarantee this. Without such a guarantee, remote access schemes have the potential to undermine the digital infrastructure of nations, making it vulnerable to hackers and cyberwar.
From a purely national security perspective, this appears a price too high to be paid.
- Digital evidence gathered through remote access, as already mentioned, is of questionable repute. Since remote access would necessarily allow control over the target system, any data on it could be manipulated and falsified, including the suppression of evidence or the creation of false evidence. Because all access happens in a covert manner, legal recourse is at risk, and because the access methods must be closely guarded for security reasons, they cannot be revealed in legal discovery. This boils down to the necessity of simply trusting individual law enforcement officers to be honest - and that in light of cases in which police have planted drugs as evidence, and the proverbial “Saturday Night Special”.
- Devices may be hard to assign to a jurisdiction. It is necessary to determine the actual location of a device before infiltrating it, otherwise the police of country A could break into a device in country B, leading to potential diplomatic turmoil. It is unlikely that a country like the USA would welcome the remote searching of a domestic device by the police of China or Russia.
Mandatory Software Backdoors
Governments could mandate that backdoors be implemented in operating systems so that law enforcement can access any device remotely, given the necessary authentication credentials. This is highly problematic, since it risks the integrity of all devices through an intentional security hole. Securing the access credentials so that they do not fall prey to hackers and foreign adversaries would be an enormous, and potentially impossible, task. Furthermore, since software and devices are shipped internationally, such a backdoor would have to be deployed per jurisdiction - potentially at the border. This is frankly unrealistic and dangerous beyond words.
In addition, the backdoor would have to be securely programmed in the first place to prevent exploitation by parties without valid authentication credentials. Furthermore, the communication towards such a remote backdoor would have to pass through all firewalls on the way - meaning that firewalls would need to be configured accordingly as well. This applies not just to corporations but also to standard users, since off-the-shelf home routers come with firewalls enabled. Beyond that, the targeting and the reachability of the device must be guaranteed, even though NAT, and especially Carrier Grade NAT, is widely deployed and does not support unsolicited incoming connections.
This would mean that government has to deploy something like current malware that actively reaches out to a command and control server or network (C&C) to request instructions. This C&C would become a prime target for denial of service attacks, but also a great source to find out who is currently under investigation, counteracting investigative goals.
Lawful Hacking
Several countries, including Germany, the Netherlands and the USA, have created legal frameworks to allow law enforcement to use existing security holes in deployed software to break into systems in order to remotely identify, search or tap them.
The main problem with this approach is that it requires law enforcement to have access to exploits - software that uses security vulnerabilities in the target to gain system access. These exploits are highly sought-after knowledge, and with growing demand not only from cyber criminals but also from law enforcement, intelligence agencies and the military, they have become a tradeable good that commands increasing prices.
This creates a dilemma. On the one hand, government has the mandate to protect its citizens (and that includes their computers) against crime and foreign aggression. On the other hand, government needs to keep exploits secret, because law enforcement relies on them to execute remote access for investigative purposes.
In addition to the problem of deciding which security holes to make known to vendors for patching and which to keep secret, government demand for exploits potentially creates a market that further erodes security, because criminals are incentivized to introduce these vulnerabilities into software. For example, contributors to open source software, or employees of software companies, might be tempted to introduce exploitable bugs into software and to later auction exploits for them to the highest bidder.
Since these exploits often demand prices beyond 500,000 USD, this is a pressing risk - especially for open source software where contributors are usually not vetted and identified sufficiently.
One suggested escape from this multi-faceted dilemma is that government only uses security vulnerabilities that have already been made known to vendors but not yet fixed. For example, it is rumored that the NSA has access to the CERT feed over which vendors are informed about found vulnerabilities. While this softens the dilemma, it comes with its own problems:
- The time to create and deploy the exploit code is significantly shortened, requiring that the government employ highly skilled and motivated experts who program and test these exploits around the clock. Again, those exploits should not fall into the wrong hands, but at the same time need to be quickly made available to authorized law enforcement entities.
- Giving government access to a stream of vulnerabilities also means that potentially many more people gain that knowledge, risking leaks. Furthermore: how to decide which government should have priority access to that knowledge, and what consequences does this have for national security?
At least the approach of using only 1-Day exploits (those vulnerabilities made known to vendors already) would contribute to drying up part of the market for exploits.
A variant of this method has recently become known. In some (unidentified) countries, internet service providers were enlisted to help the government target specific users by infecting downloads with remote access trojans on the fly. So-called drive-by attacks, however, depend on insecure usage practices by the user and are unreliable. They also risk mistakenly attacking innocents.
Targeted Updates
A rarely discussed method for remote access is the subversion of update procedures. All devices require regular updates to fix existing security vulnerabilities or deliver new features.
Update processes already inherently have the ability to change every part of the device’s software and they often provide targeting methods already - through device identifiers or licenses.
As such, they could be considered to be intentional backdoors.
Software vendors currently employ digital signatures to secure and authorize their updates. This mechanism could, however, be used by law enforcement if software vendors can be convinced (or forced) to comply. It is certain that vendors would resist such a move vehemently, but they also have a record of cooperating in the past, especially when it comes to third-party software delivery.
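A brief sketch of why the update channel is such a powerful lever (our illustration, using Ed25519 signatures; real vendors operate their own signing infrastructure): the device installs whatever verifies against the vendor’s public key, so whoever controls, or can compel, the signing step can deliver targeted code.

```python
# Illustrative sketch: update signing as the gate for what a device will install.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()   # held (or compelled) at the vendor
device_trusts = vendor_key.public_key()     # baked into the device

update_package = b"firmware v2.1 for device-id 1234"  # hypothetical payload
signature = vendor_key.sign(update_package)           # produced at release time

# On the device: install only if the signature checks out against the trusted key.
try:
    device_trusts.verify(signature, update_package)
    print("update accepted")
except InvalidSignature:
    print("update rejected")
```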
Both Google (Android) and Apple (iOS/iPhone) have already suppressed and forcibly removed software from their customers’ devices, which allows for the assumption that they could also be made to install software - if government asks for it and a sound legal process is established.
Common Problems with various regulatory means.
In the following we will touch on several open questions and problems that are common to all attempts to regulate cryptography, and engage with some of the arguments against regulation that are often repeated.
Regulation undermines security
All means known to us that soften the Going Dark problem lower the security of information systems and communication to some extent. This is to be expected, since the whole point is to grant third parties access that is not necessary for the operation of the systems themselves. Security thus must be lowered to include those parties even against the will of the user, thereby lowering the extent to which the user is able to control his devices and software. This is further amplified by the fact that any approach will increase the complexity of the software and infrastructure - and complexity is the enemy of security. Fundamentally, security and control are synonyms in this field.
However, security is not binary. It is a gradient on which we pick a value in light of trade-offs like convenience and cost. The public policy decision to deal with the Going Dark problem is just one of these trade-offs, namely that of public security and enforcement of law.
That presents us with the question of how to balance individual control against the provision of (at least) the rule of law. This is not a question of cryptography or computer security, but one of social ethics, politics and statecraft. It therefore has to be answered in that domain.
Within that domain, previous answers have been to regulate gun ownership, doors that resist police raids, mandatory government identification schemes that enable identity theft, and TSA locks on luggage. For some special needs, licensing schemes have been introduced, which could apply to crypto regulation as well - allowing unrestricted use of cryptography for some purposes, like banking and e-commerce, while strictly regulating it everywhere else.
Our answer to the public policy question is radically on the side of individual control and security: cryptographic protections, privacy, control over our devices and the integrity of information processing systems are among the most fundamental requirements in a world that relies on international communication and data processing for national, economic and personal wellbeing. This is especially true in the face of the risks of cyber crime and cyber warfare. Lowering our defenses will make us even more vulnerable than we already are, potentially risking our critical infrastructure and personal autonomy.
Regulation undermines innovation
Regulation in all areas creates a cost of compliance that redirects resources from developing what the customer actually needs. This is a commonplace. Furthermore, regulation in this particular field prevents the use of best practices, which actually creates harm for the customer.
However, regulation in other fields is accepted since it is perceived to provide a public good that would otherwise not be created. Here the argument cannot simply be one against regulation as such - since regulation is otherwise widely accepted - but instead it must be asked whether regulation in the field of cryptography would create any public good at all.
In our survey of the various means to regulate cryptography it should have become clear that while a primary public good - security - is the aim, all known means to provide it in this field also come with enormous risks for security itself, making regulation self-defeating.
Certification
If any specific regulation of cryptography is established, the question of certification arises: which software and hardware implementations satisfy the legal requirements, and how can a customer verify this?
This requires both setting an implementation standard and verifying individual implementations against it. The standard will also require timely updates as technology progresses.
Since software and hardware are global markets, international cooperation will be required. It is unlikely that the majority of relevant countries, as diverse as the USA, Germany, Canada, Russia, China and India (to name just a few), can agree on such a sensitive topic - especially since the necessary mutual trust is simply not present. If they do not, the result will be a fragmented market or mounting legal risks.
Enforcement and Effectiveness
For any regulation to be effective it first needs to be enforced, or adherence will be very low.
Some pro-crypto activists argue, however, that enforcement is impossible: it would require censoring all sources of illegal cryptographic software and applying protocol fingerprinting to detect illegal use, and no big vendor would adhere to regulation anyway because of market demands.
We would agree if the goal were total enforcement. However, almost no laws are completely enforced; complete enforcement requires a totalitarian system.
Instead, pragmatic politics accepts that enforcement is necessarily incomplete. This acceptance has three reasons:
- A large number of people will obey the law simply because there is a non-negligible risk of being caught and punished. As soon as enforcement pressure is high enough, most people fall in line. This is the case for copyright infringement on the Internet, speed limits, gun laws, and drug laws. Complete enforcement fails, but the behavior of the population is nevertheless shaped according to the goal of the law.
Unless the desire for cryptographic protection outweighs the probability of being caught and the punishment to be expected, people will obey.
This is fundamentally influenced by how much cryptographic protection adds to the satisfaction of wants and needs, since it is not a primary motivator for most people. For decades, strong cryptography was available on the market but inconvenient to use, so the vast majority of people did not use it. Will the new inconvenience of enforced crypto regulation drive down use? It certainly will.
- For those who break the law, the risk of being caught will increase. This means that some of the people who hide their crimes with illegal cryptography will at least be caught for their use of illegal cryptography.
- Law changes social perception. Lawbreakers must hide the fact that they break the law from people who may be law-abiding. This leads to social ostracism, which itself becomes a means of enforcement. Unless a behavior is widely accepted, this mechanism is highly effective. It is not unheard of for users of cryptography to be asked to justify themselves, most often with the old trope: “If you have nothing to hide you have nothing to fear.” Similarly, people who are suspected by their social environment of being criminals face a much higher risk of being reported to law enforcement.
There is no question that some criminals will not forgo the protection of cryptography, but even criminals face choices and trade-offs. They have to invest their time, money and risk tolerance wisely. This will undoubtedly lead to some criminals lowering their guard in the common assumption that “it won’t be me who gets caught”. It is easy access to cryptography that makes it widespread in some criminal circles; if the effort to procure cryptographic tools increases, some will not go the extra mile to procure them. And for those that do, if they cannot effectively hide their use of illegal cryptography, it will paint a target on their back for law enforcement to employ more targeted (and expensive) methods.
Another consideration is that the lack of a public market for cryptographic software will inevitably lead to less knowledge about which products are trustworthy and which are not. There will be fewer recognized experts looking at illegal products, so illegal products will be harder to trust. This will certainly be exploited by intelligence agencies, which will spread fear, uncertainty and doubt about some products while trying to lure targets towards products that have been outfitted with some kind of backdoor or weak algorithm.
It must therefore be concluded that no regulation actually targets the highly professional, well-equipped, deeply educated and risk-aware criminal. No previous laws have - notwithstanding public assurances to the contrary, which serve only to soothe the general public. Otherwise our prisons would be empty and judges out of work.
It is important to get this illusion of perfectly effective enforcement out of our heads, and out of the debate.
The question that remains is how to lower the availability of cryptographic tools enough to begin building any enforcement pressure at all.
The vast majority of installations of cryptographic software are either shipped with the operating system (hard disk encryption) or delivered via app stores (almost exclusively messenger software). The current majority use case clearly points at smartphones and tablets, as well as netbooks. Sales of personal computers and laptops have been plummeting, except for gaming.
This puts the platform operators into a position of substantial control. While side-loading of applications is possible on most devices, it is inconvenient and not employed by the majority of users.
One approach then could be to enlist the support of these platform operators, which control hardware, operating system and application delivery. A simple request would be to ban certain software from the app stores. This has precedent, for example Apple banning VPN applications, and Google removing the Catalonian referendum app because of a mere court order.
Such a ban on illegal cryptographic software does not need to be total; it is sufficient if vendors remove those applications on a case-by-case basis as prompted by law enforcement. If vendors do not comply, they could be held liable.
It is probably unfounded to believe that platform providers will actually stand up against government demands unless they have public opinion strongly on their side. This can be concluded from their behavior in China, Iran and India, where they cooperated with local governments against the security and freedom interests of the population.
Do those corporations risk losing customers when they go along with regulation? Certainly they do, though the impact is hard to measure. If public opinion can be swayed in favor of regulation, the impact will be minimal - especially since most consumer decisions are not primarily informed by privacy issues, but rather by convenience, availability, network effects, and low risk (through certification). This should have become clear in the wake of the Snowden leaks: neither Google, nor Facebook, nor Apple lost substantial numbers of customers, even though everybody now knew that they participated in mass surveillance programs.
Unless public opinion firmly opposes regulation of cryptography, enforcement will be no major hindrance. And to accomplish public resistance, sound arguments are required.
Plausible regulation to mitigate the Going Dark Problem. A prediction.
After describing the motives for regulation and the various technical approaches to implement it, we have to ask what means of regulation are realistic today. The survey of technologies has revealed that heavy-handed approaches like those of the 1990s are hard to implement, and even harder to keep secure enough in a digital society.
Furthermore, governments are confronted with a much wider landscape of cryptography vendors and international stakeholders than before.
Various approaches can be ruled out:
- Undermining algorithms and outlawing strong cryptography: Both lower the security of critical systems, and the general security of the population, so far that the risks posed by cybercrime and cyberwar would become unacceptable.
- Mandatory government backdoors: Again, the security impact here is out of control. Risking the computing and storage of whole nations being subverted by criminals and foreign enemies cannot be justified.
- Domestic or international key escrow, content escrow and key discovery schemes: The costs of implementation, maintenance, verification and certification would make this the biggest coordinated information technology project ever. The risks of project failure, insufficient security and stifled innovation are enormous. The required international coordination to prevent market fracture goes beyond what is possible in the current global political climate.
It is less clear whether undermining protocols can be ruled out. While the risks posed by this approach are uncontrollable and many international technical standards would need modification, the repeated focus of some politicians on end-2-end encryption is concerning. Maybe this points at an actual attempt to persuade vendors to limit its use, or it is a position taken to shift the future compromise further towards the anti-crypto faction.
This apparently leaves government hacking and convincing software vendors to ship software with less secure default settings - especially hiding key ownership verification (to support man-in-the-middle attacks) and automatically backing up communication logs and recovery keys to the cloud.
We think that this conclusion is a bit rushed and not in line with the (apparently) coordinated statements of politicians from various countries. The pressure generated by public opinion and law enforcement interest groups, and specific statements by politicians that they “just want frontdoor access” and that “providers need to be able to provide plaintext”, should give us pause and lead us to outline a few plausible additional regulatory steps.
Let’s first remember that the goal of regulation can only be to influence mass-market availability and adoption of cryptographic tools that preserve the confidentiality of content and communication relationships (metadata) against targeted government investigatory powers (1). That is where the Going Dark problem rests, and it is the minimal request by law enforcement. An extension of this goal would be to make users of strong, unregulated cryptography easier to identify and to consider the use of these technologies as circumstantial evidence of criminal intent (as is today the possession of a “weapon” while committing any other crime).
Second, the problem with most regulatory approaches seems to be that they centralize control (escrow keys or access keys) in systems that are hard to build, maintain, secure and certify (2).
Third, international coordination of detailed regulation does not seem realistic in the current global political context (3).
Fourth and last, the technical context is currently dominated by a few platform providers that control operating systems, application delivery and, to some extent, hardware (4).
Can there be a regulatory approach that recognizes these four points and incorporates them? We argue that five regulatory approaches are both realistic, and likely:
- Defense of metadata access: Convince vendors to refrain from creating systems that do not produce or retain metadata. This solves one part of the Going Dark problem and is relatively easy to argue for in the public sphere. While confidentiality of content is a long-cherished value in many societies, the anonymity of communication is widely viewed with suspicion. Furthermore, systems that suppress metadata are relatively hard to create, while metadata itself is of great interest to many vendors because it opens up potential monetization strategies.
Various attempts in this direction have already been made, notably by the EU (data retention), the UK (Investigatory Powers Act) and the USA (repeal of regulations that prevent metadata collection and use by providers). Furthermore, metadata generation and retention are among the core demands of Interpol and Europol reports on cybercrime.
- Nudge vendors to deliver software with less secure default settings: Many cryptographic tools can be weakened indirectly by exploiting human error. Most users are unaware of the necessity of verifying the identity of communication partners and the ownership of keys. This can be exploited by not making key verification a mandatory part of the user experience, by hiding the feature, or by refraining from implementing automatic protections. Intentionally or not, this could already be witnessed with WhatsApp, where verification of keys during key rotation was not performed and the user was not informed that keys had been rotated. In many cases this is sufficient to trick users into communicating with a man in the middle (see the sketch after this list).
A similar approach could be taken when it comes to preserving communication histories or backing up local data to the cloud in a way that keeps the data accessible to the vendor. This could already be witnessed with Apple iMessage.
- Lawful hacking: Various countries, among them the USA, Germany and the Netherlands, have made the exploitation of security vulnerabilities and the infiltration of computer systems legal for law enforcement. Even in the face of protests and legal ambiguities, this is quickly becoming a standard police tool. A further strengthening of international cooperation in this field, especially in the sharing of exploit code and methods, would decrease costs and increase applicability, and potentially mitigate the problem of 0-Day exploits (secret vulnerabilities) by making 1-Day exploits (vulnerabilities known to vendors but not yet patched) practicable.
One issue in this regard does require more attention: international frameworks for cross-border lawful hacking are both necessary and so far non-existent. We can observe, however, that since the G-20 talks in 2017 there have been efforts to regulate this aspect. Various proposals for cross-border cooperation, digital evidence collection and legal process coordination have been made and are finding growing support, especially among the EU, USA, UK and Canada. It should be expected that this tool will soon receive a multi-jurisdictional framework and standardization, which in turn will allow cooperation to increase effectiveness and efficiency. Lawful hacking most certainly is here to stay and spread.
- Use of update mechanisms to deliver police trojans: An extension of the lawful hacking approach is to use the system or application updaters to directly deliver government software to targeted devices. This is a very attractive method since it could potentially solve problems with exploit availability, targeting, security, and documentation that can be revealed in legal discovery. Updater software already exists, it already has the necessary authorization to install and modify running code, the delivery infrastructure exists, and pin-point targeting is available as well. Using updaters thus does not introduce new security or reliability problems while at the same time reaching the vast majority of devices.
However, vendors need to cooperate in this and must actively support law enforcement with each investigation. It would be too risky to weaken update security by giving police direct access to the infrastructure or required signature keys. Vendors in turn will refuse to cooperate if the process is not completely transparent and secured by legal safeguards - like the issuing of warrants, auditing, and notification of users after the fact. Furthermore the use of this method must be effectively limited to prevent accusations of mass surveillance. It might plausibly be enough to agree on limits on the number of undermined update events and publish statistics to sway public opinion - and thus vendor cooperation - to support this.
A variation of this approach is to not deliver police trojans, but to simply suppress updates for targeted devices. This could be used to extend the lifetime of security vulnerabilities that can be exploited by lawful hacking - after a device has been successfully infiltrated, patching of security vulnerabilities could again be allowed.
One warning should be added: Vendor cooperation to target specific devices leaks investigation details to the vendor. Law enforcement prefers to keep this secret.
- Mandate plaintext access: An elegant solution to the intricacies of key escrow and key recovery, at least from the point of view of regulators, is to refrain from defining any specific scheme for how to implement them. This gets around many of the complex details of certification, verification and international standard creation, as well as the impact of regulation on innovation, the creation of single points of failure, and some security issues. Instead of mandating specific technical implementations, a result-driven regulation would “only” mandate that vendors be able to make the plaintext of specific messages or device contents available on request.
Vendors would face fines in those cases where they are unable to deliver plaintext to law enforcement in specific cases.
This approach would of course put the complete technical burden on the shoulders of vendors, meaning that some vendors would be driven out of the market since they cannot provide adequate technical and organizational implementations to fulfill law enforcement requests. But it would also mean that no new entities have to be introduced that would present new points of failure or breach - leading to a more decentralized infrastructure.
To further enforce such a scheme, app stores could be forced through court orders to remove applications that have failed to implement plaintext access, as demonstrated by unsuccessful law enforcement requests - in the worst case, automatic de-installation of those applications is technically feasible.
This approach of simply mandating plaintext access is attractive to law enforcement and politicians since it reduces the complexity of their part significantly, hides the problematic details, and shifts all effort and liability to vendors. It looks good on paper. But it must not be forgotten that, while reducing the complexity of the overall implementation, the security problems of key escrow, content escrow and key recovery schemes still exist, even if on a smaller scale. Implementing such a regulation would undoubtedly lead to lower overall confidentiality for data at rest and in transit - not just against law enforcement, but also against unlawful access. While this might be partially mitigated by platform providers that also control the hardware design, independent software-only vendors would face a situation in which they have to increase the risks their users face just to be able to cater to law enforcement.
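To illustrate how much of the “less secure defaults” approach hinges on a single user-experience decision, here is a minimal sketch in Python. It is a hypothetical, simplified client - not the code or protocol of any real messenger - and the names Contact, receive_strict and receive_lenient are illustrative assumptions. The only difference between the two receive functions is whether a rotated key is surfaced to the user or silently re-pinned.

    # Minimal sketch (Python), not any real messenger's code: a "trust on first
    # use" client that pins a contact's public key. The two receive functions
    # differ only in how they handle key rotation.
    import hashlib
    from dataclasses import dataclass

    @dataclass
    class Contact:
        name: str
        known_key: bytes   # public key pinned on first contact

    def fingerprint(key: bytes) -> str:
        """Short, human-comparable digest of a key (a 'safety number')."""
        return hashlib.sha256(key).hexdigest()[:16]

    def decrypt(key: bytes, ciphertext: bytes) -> str:
        # Placeholder for the actual cryptographic operation.
        return ciphertext.decode()

    def receive_strict(contact: Contact, sender_key: bytes, ciphertext: bytes) -> str:
        """Secure default: refuse to proceed until the rotated key is verified out of band."""
        if sender_key != contact.known_key:
            raise RuntimeError(
                f"Key for {contact.name} changed: old {fingerprint(contact.known_key)}, "
                f"new {fingerprint(sender_key)}. Verify before continuing."
            )
        return decrypt(sender_key, ciphertext)

    def receive_lenient(contact: Contact, sender_key: bytes, ciphertext: bytes) -> str:
        """Weakened default: silently re-pin the new key; a man in the middle
        who substitutes his own key is never surfaced to the user."""
        if sender_key != contact.known_key:
            contact.known_key = sender_key   # no warning, no verification step
        return decrypt(sender_key, ciphertext)

Both clients “work” from the user’s perspective; only the strict one makes an interception attempt visible. This is why a regulator pursuing this route does not need to touch the cryptography at all - changing the default behavior is enough.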
Conclusion and advice.
We hope to have given some perspective on the technical, organizational and legal aspects of this new iteration of the Crypto Wars. We face a different situation today than the one faced in the 1990s.
The arguments usually parroted in media are not sufficient to make cryptographic regulation unattractive to politicians. Several possible routes of regulation exist, as well as approaches that do not require vendor regulation (for example, relying on lawful hacking).
The risk here is that outdated arguments distract from those alternative routes instead of helping to resist them. The problem we face is much bigger than just cryptographic regulation - we are facing a change in the views on, and guarantees of, confidentiality.
This means that we have to extend the debate to include these aspects:
- Vendor neutrality: Should it be possible to force vendors of software and hardware, and not just communication providers, to provide law enforcement with extraordinary means of access?
- Integrity of Information Processing Devices: Should we allow for provisions that undermine the integrity of - and the user’s control over - personal computers, smartphones and tablets? What are the ramifications of such provisions in light of legally binding digital contracts, liability, and the permissibility of digital evidence? Should information processing devices be considered extensions of the person that operates them, or do we consider them external artifacts that fall under public purview?
- Freedom of Processing: Do users have the right to control what software runs on their devices? Do they have the freedom to install, remove, and develop whatever software they see fit? Do users actually own - and control - their devices?
- The Right to Digital Self-Defense: Are individuals allowed to take steps to defend themselves against security risks in the digital sphere? Are they allowed to adopt best-practice approaches to make themselves less vulnerable to cyber crime, and by extension contribute to making a nation less vulnerable to cyber war? Should we consider good security practice in the digital realm part of civil defense?
These are the hard questions to ask, and they are closer to the public’s interest and domain of knowledge than many intricacies of cryptography. The answers to those questions also have broader applicability - they inform future debates as well and can thus serve as precedent for finding new norms in social ethics.
Apart from these political and ethical aspects, the debate has also revealed potential weak spots in how we do computing these days. Especially the dominance of a few platform providers, the vulnerability of update processes, and lawful hacking should prompt us to take technological action. A few suggestions in this regard:
- Software delivery should be secured by some form of “Single Source of Record” that automatically verifies that a product delivered to a device does not deviate from the installations on other devices (see the sketch after this list).
- Secure software development: The process by which software is developed needs to be taken more seriously, especially for open source software. We need better review and auditing processes for security-critical code, and greater isolation between security-critical and non-critical modules. Furthermore, review and auditing deliver only limited protection to users if the build (compile) process cannot be verified. This means that verifiable, deterministic builds should become commonplace.
- Platform vendors: The angle by which any regulation is possible today is the high dependence of most users on platform providers that control both the operating system and the application delivery channels. It is this dependency that allows regulation to capture wide sections of the market by focusing all attention on a handful of corporations in very few jurisdictions. From both public policy and security perspectives this is a risk-laden state of affairs that needs to be defused.
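As a rough illustration of the first two suggestions, the following Python sketch combines a hypothetical “Single Source of Record” - a public log holding one digest per released artifact version - with a check against an independently reproduced build. The names TRANSPARENCY_LOG, publish_release and verify_artifact are assumptions made for illustration, not an existing system or API.

    # Sketch of a "single source of record" check, assuming a public, append-only
    # log that records one digest per released artifact version. All names here
    # are hypothetical.
    import hashlib
    from typing import Optional

    # Hypothetical public log: version string -> digest published at release time.
    TRANSPARENCY_LOG: dict[str, str] = {}

    def sha256_hex(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def publish_release(version: str, artifact: bytes) -> None:
        """Vendor side: record the digest of the artifact every device should receive."""
        TRANSPARENCY_LOG[version] = sha256_hex(artifact)

    def verify_artifact(version: str, received: bytes,
                        reproduced: Optional[bytes] = None) -> bool:
        """Device or auditor side: accept only artifacts that match the public
        record and, if available, an independent deterministic rebuild."""
        expected = TRANSPARENCY_LOG.get(version)
        if expected is None:
            return False              # unlogged release: treat as suspicious
        if sha256_hex(received) != expected:
            return False              # this device received something different from everyone else
        if reproduced is not None and sha256_hex(reproduced) != expected:
            return False              # published binary does not match a rebuild of the source
        return True

    # Usage: a targeted, per-device update would fail the check because its digest
    # differs from the one recorded for all other devices.
    official = b"official build bytes"
    publish_release("messenger-2.4.1", official)
    assert verify_artifact("messenger-2.4.1", official, reproduced=official)
    assert not verify_artifact("messenger-2.4.1", b"tampered build for one device")

Deterministic builds are what make the reproduced-build comparison meaningful: only if independent parties can rebuild the same bytes from the published source can a mismatch be attributed to tampering rather than to build noise.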
We hope that this text has widened the perspective on the current debate and warned against arguments that are no longer timely. Furthermore, we hope to have given some hints for making the debate more fruitful, and to have suggested some areas of focus where engineers and developers can help shape the environment of the debate towards more secure and more freedom-preserving systems.