
In cryptography, encryption and decryption of sensitive and classified information is achieved through the combined use of cryptographic algorithms and keys. Keys are characterized by their key size or key length, which is the number of bits in the key used by the cryptographic algorithm. NIST SP 800-57 Part 1, rev. 4 defines a cryptographic key as “A parameter used in conjunction with a cryptographic algorithm that determines its operation in such a way that an entity with knowledge of the key can reproduce, reverse or verify the operation, while an entity without knowledge of the key cannot.”

Key length defines an algorithm's security, since it is “associated with the amount of work (that is, the number of operations) that is required to break a cryptographic algorithm or system.” Ideally, key length would coincide with the lower bound on an algorithm's security, and most symmetric-key algorithms are designed to have security equal to their key length. However, after design, a new attack might be discovered. For instance, Triple DES was designed to have a 168-bit key, but an attack of complexity 2^112 is now known (i.e., Triple DES has 112 bits of security).
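The gap between nominal key length and effective security can be made concrete with a back-of-the-envelope calculation. The guess rate and helper function below are illustrative assumptions, not figures from the text:

```python
# Sketch: comparing an algorithm's nominal key length with its effective
# security, using the Triple DES figures above (168-bit key, ~112-bit security).

def years_to_brute_force(security_bits: int, guesses_per_second: float = 1e12) -> float:
    """Rough years to try every key at a hypothetical guess rate."""
    seconds = 2 ** security_bits / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

nominal = years_to_brute_force(168)    # what the key length suggests
effective = years_to_brute_force(112)  # what the known attack leaves

# The attack cuts the work factor by 2^56 -- still infeasible in practice,
# but far below the design strength.
print(f"nominal: {nominal:.3e} years, effective: {effective:.3e} years")
```

Even at a trillion guesses per second, 112 bits of security still means an astronomically long attack; the point is the relative drop from the design strength, not practical breakability.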

As long as the relation between key length and security is sufficient for a particular application, it doesn't matter if key length and security coincide. This is important for asymmetric-key algorithms, because no such algorithm is known to satisfy this property; elliptic curve cryptography comes the closest with an effective security of roughly half its key length.

It is therefore understood that there isn’t a straightforward answer to the question “Can short keys leave your data vulnerable?” Acceptable key sizes in symmetric encryption are shorter than key sizes in asymmetric encryption, and elliptic curve cryptography requires shorter keys than other asymmetric schemes such as RSA. Let us delve into this topic a bit more.

## Significance of keys

Keys are used to control the operation of a cipher so that only the correct key can convert encrypted text (ciphertext) to plaintext and vice versa. Many ciphers are actually based on publicly known algorithms and so it is only the difficulty of obtaining the key that determines security of the system, provided that there isn’t any structural weakness in the algorithms used, and assuming that the key is not otherwise available, such as via theft, extortion, or compromise of computer systems. The widely accepted notion that the security of the system should depend on the key alone has been explicitly formulated by Auguste Kerckhoffs (Kerckhoffs' principle) and Claude Shannon (Shannon's Maxim).

Therefore, a key should be large enough that a brute-force attack is not feasible, meaning the attack would take too long to execute and the result would be of no use. Shannon's work on information theory showed that to achieve so-called perfect secrecy, the key must be at least as long as the message and used only once (the one-time pad). Because of the practical difficulty of managing such long keys, modern cryptographic practice has discarded perfect secrecy as a requirement for encryption, and instead focuses on computational security, under which breaking an encrypted text must be computationally infeasible for an attacker.
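Shannon's perfect-secrecy condition is easy to sketch: XOR the message with a truly random key of the same length, and never reuse the key. A minimal illustration (function names are ours, for illustration only):

```python
# Minimal one-time pad sketch: with a truly random key as long as the message,
# used exactly once, the ciphertext reveals nothing about the plaintext.
import secrets

def otp_apply(data: bytes, key: bytes) -> bytes:
    """XOR data with a same-length key; XOR is its own inverse, so this
    function both encrypts and decrypts."""
    assert len(key) == len(data), "key must be as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # fresh random key, used once
ciphertext = otp_apply(message, key)
assert otp_apply(ciphertext, key) == message
```

The impracticality is visible immediately: every message needs a new, secretly shared key of the same length, which is exactly the key-management burden that pushed practice toward computational security.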

## Key Size vs Encryption Systems

Encryption systems are often grouped into families. Common families include symmetric systems (such as AES or 3TDEA) and asymmetric systems (such as RSA or Diffie-Hellman). They may alternatively be grouped according to the central algorithm used (such as finite-field, integer-factorization and elliptic curve cryptography).

As each of these is of a different level of cryptographic complexity, it is usual to have different key sizes for the same level of security, depending upon the algorithm used. NIST defines two cryptographic algorithms as “of comparable strength for the given key sizes (X and Y) if the amount of work needed to “break the algorithms” or determine the keys (with the given key sizes and sufficient entropy) is approximately the same using a given resource.”

Given the above definition, NIST has published a table that gives the current estimates for the maximum security strengths that the approved symmetric and asymmetric cryptographic algorithms can provide for various specified key lengths. These estimates were made under the assumption that the keys used with those algorithms are generated and handled in accordance with specific rules. If these rules are not followed, the security provided may be less than the security strength estimates.

*Table 1: Comparable key strengths, adapted from NIST SP 800-57, pt.1, rev.4*

| Security strength (bits) | Symmetric algorithms | FFC (DSA, D-H) | IFC (RSA modulus) | ECC (key size) |
|---|---|---|---|---|
| 80 | 2TDEA | L = 1024, N = 160 | 1024 | 160–223 |
| 112 | 3TDEA | L = 2048, N = 224 | 2048 | 224–255 |
| 128 | AES-128 | L = 3072, N = 256 | 3072 | 256–383 |
| 192 | AES-192 | L = 7680, N = 384 | 7680 | 384–511 |
| 256 | AES-256 | L = 15360, N = 512 | 15360 | 512+ |

In the above table, the 80-bit security row includes key sizes that are deprecated.
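NIST's comparable-strength figures lend themselves to a small lookup table. The values below are from SP 800-57 pt.1 rev.4; the helper function and its name are ours, a sketch for illustration:

```python
# Security strength (bits) -> (symmetric example, RSA modulus bits),
# per NIST SP 800-57 pt.1 rev.4.
COMPARABLE_STRENGTHS = {
    80: ("2TDEA", 1024),      # deprecated
    112: ("3TDEA", 2048),
    128: ("AES-128", 3072),
    192: ("AES-192", 7680),
    256: ("AES-256", 15360),
}

def min_rsa_modulus(required_strength: int) -> int:
    """Smallest listed RSA modulus giving at least the required strength."""
    return min(rsa for s, (_, rsa) in COMPARABLE_STRENGTHS.items()
               if s >= required_strength)

print(min_rsa_modulus(128))   # a 128-bit security target needs RSA-3072
```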

Why are symmetric key sizes shorter than asymmetric ones? It’s pure mathematics.

An n-bit symmetric key gives 2^n different possible keys to choose from. If we assume there is no structural weakness in the algorithm, we have to try all 2^n of them to be certain of cracking the key; this is called a brute-force attack. AES uses keys of 128, 192 or 256 bits in length, which means an attacker would have to try 2^128, 2^192 or even 2^256 keys to break it. Keep these numbers in mind.
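These keyspace sizes are easy to compute directly; the sketch below simply evaluates 2^n for the AES key lengths named above:

```python
# Keyspace sizes for the AES variants: an n-bit key gives 2**n possibilities,
# so each extra bit doubles a brute-force attacker's work.
for bits in (128, 192, 256):
    keyspace = 2 ** bits
    print(f"AES-{bits}: {keyspace:.3e} possible keys")

# 64 extra bits means 2**64 times more work.
assert 2 ** 192 == 2 ** 128 * 2 ** 64
```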

On the other hand, RSA, an asymmetric algorithm, is a public/private key cipher, meaning you have one key to lock and another key to unlock. You make the locking key public, so that anyone can encrypt messages and send them to you and you keep the corresponding unlocking key private, so that only you can decrypt them.

Technically and algorithmically, a 128-bit RSA key isn’t comparable to a 128-bit AES key. It is actually a 128-bit number n constructed by multiplying together p and q, two randomly chosen 64-bit prime numbers. Cracking such a key requires factorizing n back into p and q. This sort of factorization is computationally complex, but it doesn’t take 2^128 tries to guarantee a result. In fact, a 128-bit RSA key would be absurdly weak by modern standards. In practice, a 1024-bit RSA key is only considered equivalent to an 80-bit symmetric key, and thus has a security strength of 80, which is why 80-bit symmetric keys and RSA-1024 are both deprecated.
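The contrast with symmetric brute force can be seen with a toy modulus. This is trial-division factoring of a textbook-sized number, not an attack on real RSA, whose moduli are 2048+ bits:

```python
# Toy illustration: recovering p and q from n = p*q by trial division,
# to show why RSA security rests on the difficulty of factoring.

def factor(n: int) -> tuple:
    """Find the smallest odd factor of n by trial division.
    Feasible only for tiny toy moduli -- the search grows with sqrt(n)."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return (d, n // d)
        d += 2
    raise ValueError("no odd factor found")

p, q = 61, 53            # small primes, as in many RSA textbook examples
n = p * q                # 3233 -- the "public" part of a toy key
print(factor(n))         # recovers the factors almost instantly
```

For a real 2048-bit modulus the same idea requires specialized algorithms (e.g., the general number field sieve) and enormous resources, which is what the "security strength" figures quantify.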

## Key Size vs Time

The actual degree of security achieved over time varies, as more computational power and more powerful mathematical analytic methods become available. For this reason, cryptologists watch for indicators that an algorithm or key length shows signs of potential vulnerability, in order to move to longer key sizes or stronger algorithms.

Advances in computer technology provided an indication back in 2007 that RSA-1024 encryption (i.e., the RSA algorithm with a 1024-bit key) would be breakable before 2010, and in 2015 the Logjam vulnerability showed that the Diffie-Hellman algorithm with a 1024-bit key is insufficient. Both have since been deprecated.

## Here Is Where Quantum Computing Comes into Play

In recent years, there has been a substantial amount of research on quantum computers, machines that exploit quantum mechanical phenomena to solve mathematical problems that are difficult for conventional computers. If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of our digital communications. The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers, and can inter-operate with existing communications protocols and networks.

The two best known quantum computing attacks are based on Shor's algorithm and Grover's algorithm. Of the two, Shor's poses the greater risk to current security systems. Shor's algorithm can easily break all of the commonly used public-key algorithms based on factoring and the discrete logarithm problem. If a suitably sized quantum computer capable of running Grover's algorithm reliably becomes available, it would reduce the effective security of a 128-bit AES key down to 64 bits, roughly a DES equivalent. This is one of the reasons why AES supports a 256-bit key length.
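Grover's square-root speedup amounts to halving the effective bit strength, which makes the arithmetic behind the AES-256 recommendation a one-liner (hedged: this ignores constant factors and the practicality of running Grover at scale):

```python
# Grover's algorithm searches 2**n keys in roughly 2**(n/2) steps,
# so symmetric security is roughly halved.
def post_quantum_symmetric_strength(key_bits: int) -> int:
    return key_bits // 2

print(post_quantum_symmetric_strength(128))  # ~DES-level, too weak
print(post_quantum_symmetric_strength(256))  # still strong -- hence AES-256
```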

The only solution, therefore, is to make public-key cryptography quantum resistant. NIST is currently running a competition to design, analyze and choose a set of new algorithms for public-key cryptography. This contest is known as the PQC Standardization Challenge, where PQC stands for Post-Quantum Cryptography. Since January 2019, the process has been in Round 2, which is expected to last another 12-18 months. Creating quantum-resistant cryptography is taking a long time because cryptanalysis is hard. Peer review, unbiased assessment and a transparent process for choosing open standards can’t be rushed, not least because deciding that a cryptographic algorithm has no holes is effectively proving a negative.

It has taken almost 20 years to deploy our modern public-key cryptography infrastructure, and it will take significant effort to ensure a smooth and secure migration from the current widely used cryptosystems to their quantum-resistant counterparts. Therefore, regardless of whether we can estimate the exact arrival of the quantum computing era, we must begin now to prepare our information security systems to resist quantum computing.

## Do Short Keys Leave You Vulnerable?

Revisiting our initial question with the above analysis in mind, it is clear that there is no yes/no answer. It depends on several factors: the confidentiality of the data to be protected, the environment where the data are stored or in transit, how long the data must remain protected, and the advances of technology. NIST SP 800-57 pt.1 rev.4 provides thorough background and criteria for selecting appropriate cryptographic algorithms and keys.

What matters most are the procedures we follow to safeguard our keys. How do we protect them from unauthorized disclosure, compromise of integrity, misuse, and lack of user authentication? Do we use applications that automate the way we manage our keys?

**Related posts**

- How Private Are Your Private Keys: Can You Rely on Your Certificate Authority for Private Key Protection?
- Microsoft TLS Leak: Are Your Keys Exposed?
- ROCA Risks: Are Your Keys Safe?