Back in December 2021, the UK and US governments announced plans to collaborate on bilateral innovation prize challenges focused on advancing privacy-enhancing technologies (PETs). This presents an opportunity to harness the power of data in a manner that protects privacy and intellectual property, enabling cross-border and cross-sector collaboration to solve shared challenges, according to the joint statement.
“Privacy-enhancing technologies are a critical component of the suite of democracy-affirming capabilities that can support our shared democratic values in the face of authoritarian exploitation of emerging technologies,” said Dr. Eric Lander, the President’s Science Advisor and Director of the White House Office of Science and Technology Policy.
I sat down with Dr. Kurt Rohloff, CTO at Duality, to discuss the implications of this partnership and to investigate how privacy-enhancing technologies like homomorphic encryption can help data operations in research, academia, banking and other sectors without harming our privacy. The conversation below has been edited for clarity.
What do you think are the benefits and drawbacks of PETs?
Dr. Kurt Rohloff: I think there are a myriad of benefits, really. There's been a growth in the regulation of sensitive data, all for good reasons. Whether it's financial records, personal location information, or medical records, this data is regulated. There are potential fines and other liabilities associated with sharing the data in an irresponsible manner. I think any reasonably intelligent person sees the benefit of that.
But one of the drawbacks is that this data, which is valuable and could really benefit society, is difficult to share. For example, there's all kinds of financial crime. Banks would like to be able to share financial records to identify scammers that exploit individuals’ trust. However, the problem is that none of us want to share our financial information. Leveraging PETs, banks can coordinate, share information about suspicious activity, and identify patterns across countries and institutions.
Do you think we could leverage PETs in the law enforcement sector?
Rohloff: As I mentioned before, PETs would allow coalitions of banks, for example, to identify and share information on behaviors that would indicate malicious intent or criminality. These banks could pool information about identified suspicious behavior and start building models of criminality.
Another, more common example is related to terrorism. In this scenario, a bank would be able to terminate its relationship with any individual who has been identified as a suspect in terrorist activity. The challenge arises when a government or a bank starts to investigate a specific individual.
Privacy-enhancing technologies offer a way for governments, investigators, and police forces to support investigations into terrorist activity without implicating individuals. PETs do two things: 1) they protect the reputations of the people being investigated, and 2) they keep the subject of the investigation confidential.
(For background on the debate over end-to-end encryption, see the following Venafi blog post.)
Do you believe that part of the reason we're discussing these privacy issues is an absence of policy?
Rohloff: There are international organizations that manage investigations of financial crime, like Interpol. The challenge is that there's a tension between privacy and data sharing: what's for the common good versus what's for the good of the individual is often unclear. Although democracies have their own preferences on what to share, the challenge comes when these countries want to cooperate to fight financial crime. How do they square what they want to share for the good of the citizens with the will of the citizens? Privacy-enhancing technologies allow room for collaboration while sharing less raw data. In some sense, the adoption of privacy technologies is even more valuable because it raises the bar on what's actually shared or not shared overall.
What is the role of PETs and homomorphic encryption in balancing security, privacy and liberty?
Rohloff: Security, freedom of speech, and privacy are all closely tied. They are very important things balanced against each other. I should be free to own my personal data. On the other hand, liberty is, in some sense, a balance between what I can do and what I want to do without harming the good of the whole. Homomorphic encryption does provide a degree of privacy for individuals, companies, and governments. It also enables potentially higher degrees of security without violating privacy, while maintaining the liberty of individuals.
(Homomorphic encryption lets you take encrypted data, transfer it to where it needs to go, perform calculations on it, and get results, without ever knowing the exact underlying data. In other words, it allows computations on encrypted data without first decrypting it, so data can be encrypted and outsourced for processing while remaining encrypted throughout.)
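(To make that idea concrete, here is a minimal, self-contained Python toy using the Paillier cryptosystem, which is additively homomorphic: ciphertexts can be added together while the underlying values stay encrypted. The primes and values are illustrative demo choices, and this is not the lattice-based fully homomorphic approach Duality builds on; fully homomorphic encryption extends the same idea to both addition and multiplication, which is what enables arbitrary computation on encrypted data.)

```python
# Toy Paillier cryptosystem: additively homomorphic, so encrypted values can be
# added without ever being decrypted. Illustrative only -- tiny demo primes,
# no hardening, and not the lattice-based schemes discussed in this article.
import math
import secrets


def keygen(p, q):
    """Build a Paillier key pair from two primes (hard-coded below for brevity)."""
    n = p * q
    n_sq = n * n
    lam = math.lcm(p - 1, q - 1)      # lambda = lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)              # valid because we use g = n + 1
    return (n, n_sq), (lam, mu, n, n_sq)


def encrypt(pub, m):
    n, n_sq = pub
    r = secrets.randbelow(n - 1) + 1  # random blinding factor in [1, n - 1]
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(n + 1, m, n_sq) * pow(r, n, n_sq)) % n_sq


def decrypt(priv, c):
    lam, mu, n, n_sq = priv
    x = pow(c, lam, n_sq)
    return ((x - 1) // n * mu) % n


pub, priv = keygen(1789, 2003)        # demo primes; real keys are ~2048-bit

c1 = encrypt(pub, 42)                 # each value is encrypted separately
c2 = encrypt(pub, 58)

c_sum = (c1 * c2) % pub[1]            # multiplying ciphertexts adds the plaintexts

print(decrypt(priv, c_sum))           # -> 100, computed without decrypting 42 or 58
```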
Rohloff: When homomorphic encryption was discovered back in 2009–2010, the first implementations took half an hour to operate on just two bits. Over the course of four years, we witnessed six orders of magnitude of performance improvement, which is well beyond Moore’s law.
Homomorphic encryption (HE) provides a different computing model than the one used in Java and Python. HE is like field-programmable gate arrays (FPGAs): it is structured, and you run a lot of operations on vectors and matrices of highly structured numeric data. When you run matrix-vector operations, HE performs really well. When you go outside of those domains, you have to make compromises and design tradeoffs. HE is really good for some things and not so good for others.
There will be applications for homomorphic encryption that work well in real time. There will be a lot of applications that work well in batch processing, taking large data sets and distilling machine learning models from them. And there are some things where it doesn't make practical sense. There are also a lot of things we know how to do now that are quite performant, and a lot of things that will probably never be performant. The research path is taking the things in that gray space that have value, bringing them into the set of things we know how to do, and then commercializing them for real-world, high-value applications.
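(Rohloff's point about structured, vectorized workloads is easiest to see in code. The sketch below assumes the open-source TenSEAL library and its CKKS scheme; the parameters are illustrative rather than a production configuration, and the exact API may vary between versions.)

```python
# Sketch of batched, SIMD-style arithmetic on encrypted vectors with CKKS,
# assuming the open-source TenSEAL library (pip install tenseal).
import tenseal as ts

# Encryption context: the polynomial degree and modulus chain set the
# precision/performance trade-off described above. Illustrative values only.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40
context.generate_galois_keys()   # rotation keys, needed for dot products

# Two parties' sensitive vectors, each packed into a single ciphertext.
enc_a = ts.ckks_vector(context, [0.1, 0.4, 0.0, 0.2])
enc_b = ts.ckks_vector(context, [0.3, 0.1, 0.5, 0.0])

enc_sum = enc_a + enc_b          # element-wise addition on ciphertexts
enc_dot = enc_a.dot(enc_b)       # inner product, result still encrypted

print(enc_sum.decrypt())         # ~ [0.4, 0.5, 0.5, 0.2] (CKKS is approximate)
print(enc_dot.decrypt())         # ~ [0.07]
```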
How do we envision the future of homomorphic encryption versus quantum computing?
Rohloff: The biggest aspect of this is that the major homomorphic encryption schemes that exist right now are all known to be post-quantum safe, that is, resistant to quantum computing attacks. The adoption of privacy-enhancing technologies is driven by a number of international efforts to adopt standard post-quantum crypto protocols. For example, NIST is running a competition to adopt international standards for post-quantum cryptography. This has been great, because when we go and talk to potential customers, we get a lot of interest because Duality is one of the few vendors of post-quantum crypto solutions.
Homomorphic encryption has added value for privacy because it allows businesses to be future-proof for the next 10 to 15 years, wherever post-quantum crypto is required. The notion of quantum computing being on the horizon has driven a lot of adoption of homomorphic encryption, as compared to other privacy technologies, because it has these nice post-quantum properties. Overall, I do see quantum computing as a driver for the adoption of homomorphic encryption.
Dr. Kurt Rohloff also mentioned that standardization of homomorphic encryption by ISO is currently in draft status, and that it would be another 2 to 4 years before formal adoption. If you wish to learn more about this promising privacy-enhancing technology, visit https://homomorphicencryption.org/.
I would like to thank Dr. Kurt Rohloff for having this insightful conversation.