Encryption Debate Ignited During 2019
Almost every week, we hear about another corporate data breach or government attack on privacy. For anyone who wants real privacy online, encryption is the essential component.
During 2019, the encryption debate ignited. Governments around the world keep trying to break encryption, seeking to enhance the power of their law enforcement agencies. For years they have tried to require companies to build backdoors into encrypted software and devices, which would enable them to listen in on potentially any digital conversation. The FBI coined the phrase "going dark," which it has used since the late '90s to describe its "problem": the lack of an omnipresent, all-powerful surveillance tool.
But encryption with special access for a select group is just broken encryption. The same security flaws used by U.S. police will be used by oppressive regimes and criminal syndicates.
Most of the anti-encryption innovation in 2019 has been rhetorical, since anti-encryption authorities are determined not to call a backdoor a backdoor. For example, the UK intelligence agency GCHQ proposed adding "ghost" listeners to encrypted messaging applications.
The tradeoff between privacy and security
Modern criminals and terrorists hide among the patterns of innocent civilians, mirroring daily life until the very last moments before becoming lethal, which can happen as quickly as a car turning onto a sidewalk or a man pulling out a knife on the street. Because police cannot intervene in an instantly lethal event, law enforcement has turned to prediction based on the surveillance of public and private data streams, facilitated by legislation like the Patriot Act, the USA Freedom Act, and the UK's Counter-Terrorism Act. This legislation has sparked a heated debate on the tradeoff between privacy and security.
Many scientists believe that this tradeoff between privacy and security is merely a technological limitation, one that can be overcome with homomorphic encryption. They believe that homomorphic encryption may eventually be the answer for organizations that need to process information while still protecting privacy and security.
What is homomorphic encryption?
Homomorphic encryption (HE) makes it possible to analyze or manipulate encrypted data without revealing the data to anyone. Just like other forms of encryption, homomorphic encryption uses a public key to encrypt the data. Unlike other forms of encryption, it uses an algebraic system to allow certain functions to be performed on the data while it’s still encrypted. Only the individual with the matching private key can access the unencrypted data after the functions and manipulation are complete. This allows the data to be, and remain, secure and private even when someone is using it.
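As a rough illustration of that definition, the sketch below uses the open-source python-paillier (phe) library, an additively homomorphic scheme; the library choice and the values are illustrative and not drawn from the article:

```python
# A minimal sketch of "computing on data while it stays encrypted",
# using the open-source python-paillier (phe) library (pip install phe).
from phe import paillier

# The data owner generates the keypair and keeps the private key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Sensitive values are encrypted with the public key before leaving the owner.
enc_a = public_key.encrypt(12)
enc_b = public_key.encrypt(30)

# A third party can add ciphertexts and scale them without ever decrypting.
enc_result = enc_a + enc_b          # encrypted 12 + 30
enc_result = enc_result * 2         # encrypted (12 + 30) * 2

# Only the private-key holder can see the final answer.
print(private_key.decrypt(enc_result))   # 84
```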
Another bonus of homomorphic encryption is that, unlike most encryption models in use today, modern fully homomorphic schemes are built on lattice problems that are believed to resist attacks by quantum computers. In fact, many proposed post-quantum cryptographic solutions rely on the same lattice-based mathematics.
There are three main types of homomorphic encryption:
- Partially homomorphic encryption, which keeps sensitive data secure by allowing only a single type of mathematical operation (such as addition or multiplication) to be performed on encrypted data, an unlimited number of times (a toy sketch of this class follows the list);
- Somewhat homomorphic encryption, which supports a limited set of operations that can be performed only a set number of times;
- Fully homomorphic encryption (FHE), the gold standard of homomorphic encryption, which supports arbitrary computations on encrypted data and keeps information secure and accessible at all times.
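To make the "select operations only" point concrete, here is a toy sketch, not taken from the article, showing that textbook RSA is partially homomorphic: ciphertexts can be multiplied but not added. The parameters are deliberately tiny and insecure.

```python
# Textbook RSA as an example of *partially* homomorphic encryption:
# E(a) * E(b) mod n decrypts to a * b, but there is no way to add ciphertexts.
# The key sizes here are toy values for readability, never for real use.

p, q = 61, 53                  # tiny primes
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 12
product_ciphertext = (encrypt(a) * encrypt(b)) % n   # multiply while encrypted
print(decrypt(product_ciphertext))                    # 84 == 7 * 12
```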
Dr. Craig Gentry describes homomorphic encryption as a glovebox: anybody can put their hands into the gloves and manipulate what's inside, but they cannot extract anything from the box. They can only take the raw materials and create something inside it. When they finish, the person who holds the key to the box can remove the materials, that is, the processed data.
The biggest barrier to wide-scale adoption of homomorphic encryption is that it is still very slow, so slow that it is not yet practical for many applications. However, companies such as IBM and Microsoft, and researchers such as Dr. Gentry, are working diligently to speed it up by reducing the computational overhead required for fully homomorphic encryption.
Crime detection that preserves privacy
Homomorphic encryption has huge potential in areas such as financial services and healthcare, where sensitive personal data must be processed and a person's privacy is paramount. In these cases, homomorphic encryption protects the sensitive details of the actual data while still allowing that data to be analyzed and processed.
One of the reasons researchers are enthusiastic about this is that the data sets they wish to study often belong to separate organizations, each of which has promised to protect the privacy and personal information of the data subjects. Fully homomorphic encryption makes it possible to generate aggregated reports comparing separate, fully encrypted datasets without revealing the underlying raw data. This could prove revolutionary for fields such as medicine, scientific research and public policy.
For example, data sets can be compared to analyze whether people provided with homeless services end up housed or holding jobs, whether student aid helps students succeed, or whether certain kinds of support keep people from being re-admitted to hospitals. These use cases all depend on comparing sensitive data held by different parties and subject to strict sharing protections. Homomorphic encryption allows such datasets to remain encrypted, protecting personal information from scrutiny, while still being compared and analyzed to produce aggregate-level summary reports.
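A minimal sketch of this kind of cross-organization aggregation, again using the python-paillier (phe) library; the agencies, the outcome flags and the oversight-body role are hypothetical illustrations, not the studies described above:

```python
# Toy sketch of privacy-preserving aggregation across two organizations,
# using the open-source python-paillier (phe) library.
from phe import paillier

# A hypothetical oversight body generates the keypair and publishes the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each agency encrypts its own records locally (1 = housed after services, 0 = not).
agency_a = [public_key.encrypt(flag) for flag in [1, 0, 1, 1]]
agency_b = [public_key.encrypt(flag) for flag in [0, 1, 1]]

# An analyst combines the ciphertexts without ever seeing an individual record.
encrypted_total = agency_a[0]
for ciphertext in agency_a[1:] + agency_b:
    encrypted_total = encrypted_total + ciphertext

# Only the oversight body can decrypt, and only the aggregate count is revealed.
print(private_key.decrypt(encrypted_total))   # 5 of 7 people ended up housed
```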
HE and law enforcement
Another area where HE can prove to be of great importance is crime detection and law enforcement. Currently, to detect events such as a murder or terrorist attack, law enforcement needs unrestricted access to data streams that might be predictive of the event. Thus, to detect an event that may appear in 0.0001% of the data, they must have access to 100% of the data stream. As a result, data collection produces a high rate of false positives, because 99.9999% of the data has nothing to do with an actual threat.
However, police forces must comply with privacy laws, such as the GDPR or CCPA, and with democratic principles designed to protect the rights of citizens and the privacy of their personal data. Wide-ranging scanning for a limited set of organized-crime signals is typically incompatible with these legal constraints, because it would give law enforcement undue power and thereby limit personal freedom.
Such data privacy rules are even more stringent when investigators from different agencies and countries collaborate. Such is the case, for example, with cross-border collaboration between law enforcement agencies in the European Union, which is regulated by a framework introduced in 2008 by EU Council Decisions 2008/615/JHA and 2008/616/JHA.
Because of such regulations and constraints, there is a conflict between safety and privacy, and a fundamental paradox arises: "protecting citizens' rights makes it more difficult to protect citizens' rights."
This paradox may not be permanent. Recent advances in homomorphic encryption and hashing allow technologies like PhotoDNA to operate within an end-to-end encrypted service and perform image hashing on encrypted data. Images in encrypted messages can be checked against known harmful material without anyone else being able to decrypt the image. Unless an image is known child sexual abuse material, the analysis reveals nothing about its contents, preserving privacy.
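The general idea can be sketched with a blinded comparison on encrypted values. This is a simplified illustration using the python-paillier (phe) library, not how PhotoDNA itself works; the hash value and party roles are hypothetical.

```python
# Toy sketch of a blinded match against a known hash, computed on encrypted data.
from phe import paillier
import secrets

# A hypothetical checking authority holds the keypair; clients see only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

KNOWN_BAD_HASH = 0x9F3A21C4            # stand-in for a known harmful image hash

def client_submit(image_hash: int):
    """The client encrypts its image hash; the plaintext never leaves the device."""
    return public_key.encrypt(image_hash)

def server_blind_compare(encrypted_hash):
    """The server homomorphically computes r * (hash - known) without decrypting."""
    r = secrets.randbelow(2**64) + 1   # random nonzero blinding factor
    return (encrypted_hash - KNOWN_BAD_HASH) * r

def authority_check(blinded) -> bool:
    """The key holder sees 0 for a match, otherwise only a blinded nonzero value."""
    return private_key.decrypt(blinded) == 0

ciphertext = client_submit(0x9F3A21C4)
print(authority_check(server_blind_compare(ciphertext)))   # True: the hash matches
```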
Using HE, law enforcement has access only to the predictions of a model, as opposed to having access to the entire dataset. This is like the use of drug dogs at an airport. Drug dogs eliminate the need for law enforcement to search everyone's bags looking for cocaine. Instead, a dog is trained to detect only narcotics. Barking = drugs. No barking = no drugs.
In crime detection, a positive prediction would mean "a terrorist plot is being planned on this phone," while a negative prediction would mean "a terrorist plot is not being planned on this phone." Law enforcement has no need to see the underlying data; it needs only this one datapoint, the result of the predictive analysis.
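As a concrete sketch of "only the prediction is revealed," additive homomorphic encryption is already enough to evaluate a simple linear model on encrypted features. The weights, features, threshold and party roles below are hypothetical, and the python-paillier (phe) library is again assumed.

```python
# Toy sketch: scoring encrypted features with a public, audited linear model.
from phe import paillier

# A hypothetical oversight authority holds the keypair.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

weights = [0.8, -0.3, 1.5]        # audited model weights, public
features = [2.0, 4.0, 1.0]        # sensitive features, known only to the device owner

# The device owner encrypts its features before anything leaves the device.
encrypted_features = [public_key.encrypt(x) for x in features]

# The analyst computes the encrypted score without ever seeing the features.
encrypted_score = encrypted_features[0] * weights[0]
for x, w in zip(encrypted_features[1:], weights[1:]):
    encrypted_score = encrypted_score + x * w

# Only the authority decrypts, and only the single prediction is revealed.
score = private_key.decrypt(encrypted_score)
print(score >= 1.0)   # the one datapoint: does the score cross the alert threshold?
```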
Furthermore, because the model is a discrete piece of intelligence, it can be independently evaluated to ensure that it detects only what it is supposed to (just as we can independently audit what a drug dog is trained to detect by testing the dog's accuracy). Unlike drug dogs, however, HE could give us the ability to detect any crime that leaves digital evidence.
Scientific research has demonstrated the efficiency of HE in crime detection across various scenarios. Returning to the cross-border collaboration of law enforcement agencies, Anthony Barnett and other security researchers at European universities and companies have proposed a solution based on homomorphic encryption, tested with "realistically-sized databases" run "on an average machine." The performance of this HE-based solution "is well within the limit set by the EU Council Decisions and we thus obtain the first usable implementation of the Automated Detection of Organized Crime (ADOC) framework with an enhanced privacy model."
Conclusion
Preserving privacy is a fundamental human right. Advances in homomorphic encryption may finally resolve the security-versus-privacy equation. It is in the interest of everyone, from governments and law enforcement to technology companies, to invest heavily in the advancement of technologies such as homomorphic encryption, instead of wasting valuable resources on an endless debate about placing backdoors in current encryption schemes.