Computer encryption is based on the science of cryptography, which has been practiced for as long as humans have wanted to keep information secret. In the last three decades, public-key cryptography has become a fundamental ingredient of our global digital infrastructure, supporting a plethora of applications that are important to our economy, our security, and our way of life, such as mobile phones, internet commerce, social networks, and cloud computing.
In a highly connected world, the ability of individuals, businesses and governments to communicate securely is of the utmost importance. Public-key encryption, digital signatures and key exchange are the lifeblood of our mission-critical communications. These services are primarily implemented using Diffie-Hellman key exchange, the RSA cryptosystem and elliptic curve cryptosystems. The algorithms beneath these cryptographic systems are based on hard-to-solve mathematical problems, such as Integer Factorization or the Discrete Log Problem. The difficulty of these problems is what gives the cryptography built on them its security.
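To make the discrete log example concrete, here is a minimal sketch of Diffie-Hellman key exchange in Python. The parameters are textbook-sized toys chosen for readability, not security; real deployments use moduli of 2048 bits or more, or elliptic curve groups.

```python
import secrets

# Toy Diffie-Hellman exchange over a tiny prime field. These parameters
# are illustrative only and provide no security.
p, g = 23, 5                       # small prime modulus and generator

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # Alice sends A = g^a mod p in the clear
B = pow(g, b, p)                   # Bob sends B = g^b mod p in the clear

# Both parties derive the same shared secret without ever transmitting it.
assert pow(B, a, p) == pow(A, b, p)

# An eavesdropper sees p, g, A and B; recovering a or b from them is the
# Discrete Log Problem, whose assumed hardness secures the exchange.
```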
In recent years, there has been a substantial amount of research on quantum computers, machines that exploit quantum mechanical phenomena to solve mathematical problems that are difficult for conventional computers. If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of our digital communications. The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers, and can interoperate with existing communications protocols and networks.
What is quantum computing?
Traditional computers work with binary digits, or bits for short, that are either zero or one. Typically, zero and one are represented by some conventional physical property, such as whether or not a capacitor holds an electric charge.
Quantum computers, on the other hand, work with quantum bits, referred to as qubits, which can essentially represent zero and one at the same time. In theory, that makes it possible to perform calculations in parallel that would normally require a loop to do them one at a time. Qubits rely on the quantum-mechanical phenomena of superposition and entanglement, which engineers are still learning to harness in practice. The idea, simply speaking, is that for some types of algorithm, a quantum computer can calculate in N units of time what would otherwise take 2^N units of time to work out. In other words, some problems that conventionally require exponential time algorithms would become solvable in polynomial time.
To refresh our memory: exponential functions involve raising something to the power of X and grow enormously quickly, whereas polynomial functions involve raising X to a fixed power, which makes them far more manageable than exponential ones.
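A few lines of Python make the difference tangible: a cubic cost stays manageable as n grows, while an exponential cost explodes.

```python
# Polynomial growth (n**3) versus exponential growth (2**n).
for n in (10, 20, 40, 80):
    print(f"n={n:>2}  n^3={n**3:,}  2^n={2**n:,}")

# n=10  n^3=1,000  2^n=1,024
# n=20  n^3=8,000  2^n=1,048,576
# n=40  n^3=64,000  2^n=1,099,511,627,776
# n=80  n^3=512,000  2^n=1,208,925,819,614,629,174,706,176
```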
As a result, many people are worried that if quantum computers really work as claimed and can be scaled up to offer far more processing power and qubit memory than they do today, they could successfully solve problems that we currently consider "computationally infeasible".
The most obvious example is cracking encryption
Public-key encryption relies on "trapdoor" mathematical functions: operations that are easy to perform in one direction but much harder to reverse. Multiplication is the classic example: even an elementary school student can multiply two numbers together, and 593 times 829 is 491,597. But it is hard to start with the number 491,597 and work out which two prime numbers must be multiplied to produce it.
And it becomes increasingly difficult as the numbers get larger. Indeed, computer scientists consider it practically impossible for a classical computer to factor numbers longer than 2048 bits, which is the modulus size used in the most common form of RSA encryption. These encryption systems have never been unbreakable in an absolute sense; rather, they are specifically designed so that breaking them would take a classical computer so long that they are unbreakable in practice.
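Here is the asymmetry in miniature, using the primes from the example above: the forward direction is a single multiplication, while the naive reverse direction is a search that grows with the square root of the number being factored.

```python
def smallest_factor(n):
    """Trial division: return the smallest prime factor of n and its cofactor."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

print(593 * 829)               # forward: one instant multiplication -> 491597
print(smallest_factor(491597)) # reverse: a search -> (593, 829)

# Trial division needs on the order of sqrt(n) steps, and even the best
# known classical factoring algorithms still run in super-polynomial time,
# which is why a 2048-bit modulus is considered out of reach.
```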
Is this the end of cryptography as we know it?
Fortunately, the answer is “No” because of a cat and a computer.
With quantum computers, even though you can do a whole load of calculations in parallel because the qubits are in multiple quantum states at the same time, you can only read out one of the valid answers, at which point all the other answers vanish. In other words, you can calculate and “store” multiple answers concurrently, but you can’t enumerate all the valid answers afterwards.
If you’ve heard of Erwin Schrödinger’s Cat, you’ll recognize this problem:
“Schrödinger’s Cat is a thought experiment in which a “quantum cat” hidden out of sight inside a box may simultaneously be both alive and dead, because quantum cats can be in both states at the same time, provided you aren’t looking at them. But as soon as you open the box to see this amazing creature in real life, it immediately adopts one of the possibilities—so opening the box may kill the cat instantly. You can’t figure out in advance if it’s safe—safe for the cat, that is—to open the box.”
This thought experiment illustrates quantum superposition and the measurement problem: a quantum system can exist in a combination of states at once, but the act of observing it forces it into a single definite state.
In 1994, Peter Shor developed a quantum algorithm (Shor's algorithm) that solves both integer factorization and discrete logarithms in polynomial time, and would therefore be able to break RSA as well as any discrete log-based cryptosystem (including those using elliptic curves). This implies that all widely used public-key cryptography would be insecure if someone were to build a large-scale quantum computer.
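The quantum part of Shor's algorithm finds the period of the function a^x mod N in polynomial time; everything after that is classical number theory. The sketch below brute-forces the period, which is only feasible for toy numbers, but the gcd step that extracts the factors is exactly the one Shor's algorithm uses.

```python
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N). Brute force stands in here for
    the quantum period-finding step, which runs in polynomial time."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

N, a = 15, 7                   # toy modulus and a base coprime to it
r = find_period(a, N)          # r = 4 for this choice
assert r % 2 == 0              # Shor retries with a new base if r is odd

half = pow(a, r // 2, N)
print(N, "=", gcd(half - 1, N), "*", gcd(half + 1, N))   # 15 = 3 * 5
```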
The question of when a large-scale quantum computer will be built is complicated. While in the past it was uncertain whether large quantum computers were physically possible, many scientists today regard it as an engineering challenge. Some experts even predict that within the next 20 or so years, quantum computers large enough to break essentially all public-key schemes currently in use will be built.
In fact, this isn’t an easy quest. In 2012, physicists used a four-qubit quantum computer to factor 143, and in 2014 they used a similar device to factor 56,153. Somebody could say that at this rate of progress, quantum computers should soon be able to outperform the best classical ones.
But this is far from true. Quantum factoring is much harder in practice than expected, because noise becomes a significant problem for large quantum computers. The best current way to tackle noise is to use error-correcting codes, which themselves consume a significant number of extra qubits. Taking this into account dramatically increases the resources required to factor 2048-bit numbers: in 2015, researchers estimated that a quantum computer would need a billion qubits to do the job reliably.
Fast-forward four years to 2019, and that estimate has been revised downwards. Craig Gidney at Google in Santa Barbara and Martin Ekerå at the KTH Royal Institute of Technology in Stockholm, Sweden found a more efficient way for quantum computers to perform the code-breaking calculations, reducing the resources they require by orders of magnitude. The two researchers demonstrated that a quantum computer could do the calculation with just 20 million qubits in about eight hours. "[As a result], the worst case estimate of how many qubits will be needed to factor 2048 bit RSA integers has dropped nearly two orders of magnitude," they say.
Which algorithms are at risk?
Bruce Schneier says the news isn't too bad for symmetric cryptography. "Grover's algorithm shows that a quantum computer speeds up these attacks to effectively halve the key length. This would mean that a 256-bit key is as strong against a quantum computer as a 128-bit key is against a conventional computer; both are secure for the foreseeable future."
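The halving comes from Grover's quadratic speedup: searching an unstructured space of 2^k keys takes roughly 2^(k/2) quantum steps, as this small arithmetic check shows.

```python
# Grover's algorithm needs roughly sqrt(2**k) = 2**(k/2) iterations to
# search a k-bit keyspace, so the effective key length is halved.
for k in (128, 256):
    print(f"{k}-bit key: ~2^{k} classical tries, ~2^{k // 2} Grover iterations")

# 128-bit key: ~2^128 classical tries, ~2^64 Grover iterations
# 256-bit key: ~2^256 classical tries, ~2^128 Grover iterations
```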
For public-key cryptography, the results are more dire, as depicted in the table below, extracted from the NIST IR 8105 publication. Shor's algorithm can easily break all of the commonly used public-key algorithms, which are based on factoring and the discrete logarithm problem.
Table 1: Impact of Quantum Computing on Common Cryptographic Algorithms. Source: NIST IR 8105

| Cryptographic Algorithm | Type | Purpose | Impact from large-scale quantum computer |
| --- | --- | --- | --- |
| AES | Symmetric key | Encryption | Larger key sizes needed |
| SHA-2, SHA-3 | - | Hash functions | Larger output needed |
| RSA | Public key | Signatures, key establishment | No longer secure |
| ECDSA, ECDH (Elliptic Curve Cryptography) | Public key | Signatures, key exchange | No longer secure |
| DSA (Finite Field Cryptography) | Public key | Signatures, key exchange | No longer secure |
In other words, a quantum computer can't crack the AES encryption itself, but it doesn't have to: it can break the public-key algorithm that protects the AES key, recover that key, and then decrypt the AES data directly.
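This works because real protocols use hybrid ("envelope") encryption, in which the bulk data is AES-encrypted and only the AES session key travels under RSA. The sketch below, written against the third-party `cryptography` package, shows the pattern: break the RSA wrapping with Shor's algorithm and the AES layer falls without being attacked at all.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

aes_key = AESGCM.generate_key(bit_length=256)   # fresh session key
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"mission-critical data", None)

# Wrap the session key under the recipient's RSA public key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_key.public_key().encrypt(aes_key, oaep)

# What actually travels: (wrapped_key, nonce, ciphertext). Break RSA,
# recover aes_key, decrypt ciphertext; the quantum threat to this message
# is the RSA wrapping, not the AES layer.
```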
Is there another solution? Bruce Schneier puts it nicely: "Yes, I know that quantum key distribution is a potential replacement for public-key cryptography. But come on—does anyone expect a system that requires specialized communications hardware and cables to be useful for anything but niche applications? The future is mobile, always-on, embedded computing devices. Any security for those will necessarily be software only."
What’s the path forward?
The only solution, therefore, is to make public-key cryptography quantum resistant. NIST is currently running a competition to design, analyze and choose a set of new algorithms for public-key cryptography. This contest is known as the PQC Standardization Challenge, where PQC stands for post-quantum cryptography.
The new public-key cryptography standards will specify one or more additional digital signature, public-key encryption, and key-establishment algorithms to augment FIPS 186-4, “Digital Signature Standard (DSS)”, as well as special publications NIST SP 800-56A Revision 3, “Recommendation for Pair-Wise Key Establishment Schemes Using Discrete Logarithm Cryptography”, and NIST SP 800-56B Revision 2, “Recommendation for Pair-Wise Key-Establishment Schemes Using Integer Factorization”. It is intended that these algorithms will be capable of protecting sensitive information well into the foreseeable future, including after the advent of quantum computers.
The process has been running since April 2016, when NIST started accepting proposals, and entered its first evaluation stage in November 2017, when NIST stopped accepting new algorithms for consideration. Of the 82 submissions received, NIST announced on 20 December 2017 that 69 First-Round Candidates met both the submission requirements and the minimum acceptability criteria.
On 30 January 2019, the project went into Round 2, with NIST announcing that 26 of the 69 first-round candidates were through to what it calls the 'semifinals'. The proposals were evaluated against three broad criteria: security, cost and performance, and algorithm and implementation characteristics. NIST expects the next stage of evaluation to take 12 to 18 months, after which there may be a third round, and then official standard algorithms will be picked.
The proposed post-quantum submissions are based on lattices, codes, multivariate polynomials, and hash-based signatures.
- Lattice-based cryptography relies on the presumed hardness of lattice problems (for example, the shortest vector problem), which involve finding particular points in lattices with many spatial dimensions and are believed to be far harder to solve than the integer factorization underlying current encryption. Most lattice-based key establishment algorithms are relatively simple, efficient, and highly parallelizable, and lattices are also the basis of exciting new applications such as fully homomorphic encryption, code obfuscation and attribute-based encryption.
- Code-based cryptography is based on the McEliece cryptosystem, which was first proposed in 1978, and has not been broken since. Since that time, other systems based on error-correcting codes have been proposed. While quite fast, most code-based primitives suffer from having very large key sizes.
- Multivariate polynomial cryptography relies on the difficulty of solving systems of multivariate polynomials over finite fields, and has historically been more successful as an approach to signatures.
- Hash-based signatures are digital signatures constructed from hash functions, and their security, even against quantum attacks, is well understood because it reduces to the properties of the underlying hash function. However, hash-based signature schemes have the drawback that the signer must keep an exact record of how many messages have been signed, and any error in this record results in insecurity. Another drawback is that increasing the number of signatures a key can produce, even to the point of being effectively unlimited, increases the signature size.
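To make the hash-based approach concrete, here is a minimal Lamport one-time signature in Python, the simplest hash-based scheme and the building block that tree constructions such as XMSS multiply into many-time keys. Its one-time nature is exactly why signers must track which keys have been used.

```python
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def sign(message, sk):
    # Reveal one secret per bit of the message digest. Signing two different
    # messages with the same key leaks extra secrets, so the key is one-time.
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(message, signature, pk):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(signature[i]) == pk[i][bit] for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(b"hello post-quantum world", sk)
assert verify(b"hello post-quantum world", sig, pk)
```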
The process of creating quantum resistant cryptography is taking a long time because cryptanalysis is hard. Peer review, unbiased assessment and a transparent process to choose open standards can’t be rushed, not least because deciding that a cryptographic algorithm doesn’t have holes is effectively proving a negative.
It has taken almost 20 years to deploy our modern public-key cryptography infrastructure, and it will take significant effort to ensure a smooth and secure migration from the current widely used cryptosystems to their quantum-resistant counterparts. Therefore, regardless of whether we can estimate the exact arrival time of the quantum computing era, we must begin now to prepare our information security systems to resist quantum computing.