At last year’s Machine Identity Management Summit, we were lucky to have leaders from NIST share their vision of zero trust architectures and post-quantum cryptography. In light of NIST’s pending announcements on the standards for post-quantum cryptography, I thought it would be interesting to revisit their talk from the conference.
Within the National Cybersecurity Center of Excellence (NCCoE), a component of NIST, Alper Kerman, Principal Lead for Zero Trust Efforts, and Bill Newhouse, Cybersecurity Engineer, work collaboratively with industry to build example cybersecurity solutions using commercially available technologies contributed by project participants.
Here’s what these leaders presented at the Venafi Summit in 2023.
The zero trust architecture project started in partnership with the Federal CIO Council back in October of 2018, when we held a zero trust workshop. That event led to a year-long zero trust research project at NIST, launched in early 2019. At the end of that project, NIST published Special Publication 800-207, its zero trust architecture (ZTA) guidance.
As defined by NIST, zero trust assumes there is no implicit trust granted to assets or user accounts based solely on their physical or network location, or on asset ownership. Authentication and authorization are discrete functions performed before a session to an enterprise resource is established. Zero trust focuses on protecting resources, not network segments, because network location is no longer seen as the prime component of a resource’s security posture. ZTA uses zero trust principles to plan industrial and enterprise infrastructure and workflows.
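To make that resource-centric model concrete, here’s a minimal Python sketch of a policy decision function that evaluates each session request on identity and device posture rather than network location. The attribute names and policy rules are illustrative assumptions, not NIST’s actual model.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    """One request for one enterprise resource; each is evaluated independently."""
    user_id: str
    mfa_verified: bool       # authentication is checked per session, never assumed
    device_compliant: bool   # asset posture feeds the decision
    resource: str
    source_network: str      # captured, but deliberately NOT used to grant trust

def policy_decision(req: AccessRequest, sensitive_resources: set[str]) -> bool:
    """Grant or deny a single session. Network location never confers trust."""
    if not req.mfa_verified:
        return False         # no implicit trust in the account
    if req.resource in sensitive_resources and not req.device_compliant:
        return False         # device posture matters for sensitive resources
    return True

# An on-premises request is evaluated exactly like a remote one.
req = AccessRequest("alice", True, True, "payroll-db", "corporate-lan")
print(policy_decision(req, {"payroll-db"}))  # True, but only via explicit checks
```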
But their ZTA work didn’t stop there. NIST is constantly brainstorming future ZTA activities, such as strengthening ZTA by integrating it with other NCCoE projects like the data classification project. Another area is the hardware supply chain, where ZTA can play a significant role in restricting access based on attributes. There’s also a ZTA aspect to securing DevOps environments, and NIST is already working on a project covering software supply chain and DevOps security practices.
And then there’s the migration to post-quantum cryptography project, which is top of mind for many right now. One aspect of that project is restricting access to components that use quantum-vulnerable cryptography. To do that, you need to know where those components are. ZTA can help because it supports discovery, which makes it possible to build an inventory of where these cryptographic modules are located.
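As a taste of what one discovery step can look like, here’s a hedged sketch that scans PEM certificates on disk and flags quantum-vulnerable public key algorithms. It assumes the Python `cryptography` package and a local `./certs` directory; a real inventory would also have to cover protocols, libraries, HSMs, and application code.

```python
from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def inventory(cert_dir: str) -> list[tuple[str, str]]:
    """Report the public key algorithm of every PEM certificate in a directory."""
    findings = []
    for path in Path(cert_dir).glob("*.pem"):
        cert = x509.load_pem_x509_certificate(path.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            findings.append((path.name, f"RSA-{key.key_size} (quantum vulnerable)"))
        elif isinstance(key, ec.EllipticCurvePublicKey):
            findings.append((path.name, f"EC {key.curve.name} (quantum vulnerable)"))
        else:
            findings.append((path.name, type(key).__name__))
    return findings

for name, algo in inventory("./certs"):
    print(f"{name}: {algo}")
```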
Essentially, NIST is trying to raise awareness of the issues involved in migrating to post-quantum cryptographic algorithms. In the process, it is also trying to ease the migration from the current public key algorithms, which are deemed vulnerable to attack by a quantum computer, to the new ones.
To that end, NIST has selected four algorithms to standardize as a result of its Post-Quantum Cryptography (PQC) Standardization Process: the key-encapsulation mechanism CRYSTALS-Kyber and three digital signature schemes: CRYSTALS-Dilithium, FALCON, and SPHINCS+.
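For a feel of how the selected algorithms are used in practice, here’s a minimal key-encapsulation round trip with CRYSTALS-Kyber. It assumes the Open Quantum Safe project’s liboqs Python bindings (`oqs`), and the algorithm name strings vary by liboqs version (`Kyber768` here; newer releases expose it as `ML-KEM-768`).

```python
import oqs

kem_alg = "Kyber768"  # version-dependent name; newer liboqs uses "ML-KEM-768"
with oqs.KeyEncapsulation(kem_alg) as receiver:
    public_key = receiver.generate_keypair()   # receiver publishes this
    with oqs.KeyEncapsulation(kem_alg) as sender:
        # Sender encapsulates: derives a shared secret plus a ciphertext.
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)
    # Receiver decapsulates the ciphertext with its private key.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver  # both sides now agree
```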
According to NIST, a quantum computer should be able to run Shor’s algorithm and break the mathematics behind the public key (asymmetric) encryption we rely on. Such a machine doesn’t exist today, as best we know, but when it is developed, much of our asymmetric encryption will be vulnerable.
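The toy example below shows why: an RSA private key falls out immediately once the modulus is factored, and efficient factoring is exactly what Shor’s algorithm provides. The numbers are deliberately tiny so classical trial division works; against a real 2048-bit key, only a cryptographically relevant quantum computer could pull this off.

```python
def factor(n: int) -> tuple[int, int]:
    """Trial division; feasible only for toy-sized moduli."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

p, q = 61, 53                      # secret primes (toy-sized)
n, e = p * q, 17                   # public key: modulus and exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, known to the key owner

# An attacker who can factor n recovers the same private exponent:
p2, q2 = factor(n)
d_attacker = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_attacker == d

msg = 42
assert pow(pow(msg, e, n), d_attacker, n) == msg  # attacker decrypts
```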
So NIST started this process in 2016 because it knew identifying the algorithms would take a long time. But all cryptographic migrations take a while, and this one will be the largest ever attempted. Literally anybody who uses digital communications and wants to encrypt data, or who relies on digital technologies built on cryptography, will be impacted.
Protocols such as TLS and QUIC rely on public key encryption, and hardware security modules (HSMs) are also widely used in cryptography. Those protocols have vulnerable algorithms within them and need to be updated; adding the new algorithms as they become standards will be a first step for the standards organizations behind those protocols.
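A simple place to start assessing protocol exposure is to look at what a live TLS handshake actually negotiates, which Python’s standard library can show. The host below is a placeholder; the key exchange and certificate signature, the quantum-vulnerable parts, sit behind the cipher and certificate details printed here.

```python
import socket
import ssl

host = "example.com"  # placeholder host
ctx = ssl.create_default_context()
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print("Protocol:", tls.version())   # e.g. TLSv1.3
        print("Cipher:  ", tls.cipher())    # (name, protocol version, secret bits)
        cert = tls.getpeercert()             # validated certificate, as a dict
        print("Cert subject:", cert.get("subject"))
```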
A cryptographically relevant quantum computer (CRQC) will break public key encryption. Quantum readiness means you’ve identified where such a device could cause you problems. There’s a concern that adversaries are harvesting and storing encrypted data today to break it later, once a CRQC is available. NIST is showing organizations techniques for putting the new algorithms in place and into use, and for understanding which algorithms are running in their environments today.
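And as one example of putting a new algorithm into use, here’s a signing round trip with CRYSTALS-Dilithium via the same liboqs bindings assumed above; again, the algorithm name string depends on your liboqs version (`Dilithium3` here; newer releases use `ML-DSA-65`).

```python
import oqs

message = b"software release v1.2.3"
with oqs.Signature("Dilithium3") as signer:
    public_key = signer.generate_keypair()  # signer publishes this
    signature = signer.sign(message)

# Anyone holding the public key can verify the signature.
with oqs.Signature("Dilithium3") as verifier:
    assert verifier.verify(message, signature, public_key)
```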
But are we ready for such a big transition? It’s widely believed that most organizations don’t have a full grasp of all the cryptographic algorithms they rely on. There are plenty of challenges in adoption, implementation, and deployment; this is not uncomplicated. In fact, it is very complicated. NIST’s work aims to help smooth that out for many.
