Apple, Google, Microsoft and others stated that a “ghost proposal” put forward by the Government Communications Headquarters (GCHQ) threatens fundamental human rights including privacy.
On 22 May 2019, a coalition of civil society organizations, security researchers and some of the world’s best-known tech companies published an open letter opposing a proposal by GCHQ to silently inject a law enforcement officer, or “ghost,” into encrypted chats. The coalition stated that the proposal, if enacted, “would pose serious threats to cybersecurity and thereby also threaten fundamental human rights, including privacy and free expression.”
GCHQ first proposed the idea of a “ghost” in a Lawfare blog post entitled “Principles for a More Informed Exceptional Access Debate.” Ian Levy, technical director of the National Cyber Security Centre (a part of GCHQ), and Crispin Robinson, technical director for cryptanalysis at GCHQ, wrote in the article that using a ghost would constitute a “better way” of enabling law enforcement to access services and devices protected by end-to-end encryption. Towards this end, the duo crafted six principles which they felt would allay the concerns of both privacy advocates and digital security experts. These guidelines are as follows:
- Law enforcement would receive exceptional access as a ghost only in situations where there is a legitimate need, where the information collected could prove useful to an investigation, where this access is the least intrusive means of obtaining that data, and where there is appropriate legal authorization for listening in.
- Law enforcement and service providers should work together to understand technology’s evolution so that the former doesn’t waste time trying to reverse engineer new products.
- If the ghost proposal were to come into effect, law enforcement should work together with service providers and device manufacturers so that instances of exceptional access don’t change users’ trust in those services or devices. A key part of this objective, per the authors, involves recognizing security as an imperfect, non-binary construct.
- Service providers should work with law enforcement so that they themselves are involved in enacting every instance of exceptional access, thereby preventing governments from abusing such a solution as a backdoor into their citizens’ data.
- If enacted, the ghost proposal should not force service providers to do something that fundamentally changes the trust relationship between them and their users.
- Together, service providers and law enforcement should be transparent about any exceptional access solutions and should be willing to submit these implementations for expert analysis and testing.
In their open letter, Google, Microsoft and their co-signatories voiced support for these principles, but they rejected the ghost proposal itself. Their reasoning: implementing the proposal would require service providers to secretly inject a new public key into a conversation, silently turning a two-party dialogue into a group chat. The proposal might even require service providers to change the encryption schemes they use or to mislead users by suppressing the notifications that would normally announce a new participant joining a conversation.
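The key-injection mechanism the signatories describe can be sketched in a few lines. The following is a toy model (not real cryptography, and not GCHQ's actual design): XOR stands in for proper public-key wrapping, and all names are illustrative assumptions. It shows why fan-out key distribution means a silently added "ghost" key receives everything the legitimate participants do.

```python
# Toy sketch (NOT real cryptography) of the fan-out key distribution used by
# end-to-end encrypted group chats, illustrating why a silently added
# "ghost" key grants full access. XOR stands in for public-key wrapping.
import secrets

def wrap(message_key, member_secret):
    # Stand-in for encrypting the message key to one member's public key.
    return bytes(a ^ b for a, b in zip(message_key, member_secret))

def unwrap(wrapped_key, member_secret):
    # XOR is its own inverse, so unwrapping mirrors wrapping.
    return bytes(a ^ b for a, b in zip(wrapped_key, member_secret))

# Alice and Bob believe they are in a two-party conversation.
roster = {"alice": secrets.token_bytes(16), "bob": secrets.token_bytes(16)}

# Under the ghost proposal, the provider silently extends the roster with a
# law enforcement key and suppresses the "participant added" notification.
roster["ghost"] = secrets.token_bytes(16)

# The sender wraps one fresh message key for every key in the roster.
message_key = secrets.token_bytes(16)
fanout = {member: wrap(message_key, secret) for member, secret in roster.items()}

# The ghost recovers the same message key as the legitimate participants.
assert unwrap(fanout["ghost"], roster["ghost"]) == message_key
```

The sketch also shows why the change is invisible to users unless the client faithfully reports roster changes, which is precisely the notification the proposal would suppress.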
The international coalition argued that such modifications would violate one of the key GCHQ principles for exceptional access solutions. As it explained in its letter:
“The GCHQ proponents of the ghost proposal argue that “[a]ny exceptional access solution should not fundamentally change the trust relationship between a service provider and its users. This means no tasking the provider to do something fundamentally different to things they already do to run their business.” However, the exceptional access mechanism that they describe in the same piece would have exactly the effect they say they wish to avoid: it would degrade user trust and require a provider to fundamentally change its service.”
The organizations went on to argue that enacting the ghost proposal would undermine users’ trust in the authentication methods used for their apps, introduce potential security vulnerabilities and empower law enforcement and/or malicious actors to potentially abuse the ghost function.
Google, Apple and the other signatories weren’t alone in their opposition to the proposal, either. Susan Landau, Bridge Professor in the Fletcher School of Law and Diplomacy and the School of Engineering at Tufts University, wrote on Lawfare that it’s not clear what adding a silent listener would look like in practice and that the current proposal doesn’t fully appreciate the sweeping changes it would require service providers to make to their communication infrastructure.
Indeed, ACLU Senior Technology Fellow Jon Callas noted that every company providing secure communications and every government seeking exceptional access would need to agree on a means of implementation and cooperate to build that capability. Such an effort, he argued, is ultimately about embodying policy, and because that policy plays out in international politics, agreement of this kind is rarely achievable.
Then there was Bruce Schneier, a security technologist who argued in his own Lawfare post that the GCHQ proposal is untenable because all “exceptional access mechanisms… reduce the security of the underlying system” and thereby expose users to digital threats.
Broderick Perelli-Harris, senior director at Venafi, said something similar to NS Tech:
“Tech companies simply can’t grant access to ‘cc’ a third recipient into communications; it would allow cyber criminals to undermine all types of private and secure communications. At this moment, citizens in the UK have basic rights to privacy. But if the government mandates backdoors, that protection goes away.”
Perelli-Harris is right. The principles identified by GCHQ might be good starting points for further discussion about user privacy and law enforcement. However, they don’t justify a proposal that would increase users’ digital risk and erode the trust they place in service providers. These organizations have a duty to keep users and their data safe. Towards that end, they need to exercise extreme care with their encryption assets. That includes making sure that bad actors aren’t misusing their keys and certificates.