{"id":1139939,"date":"2025-05-27T09:00:00","date_gmt":"2025-05-27T16:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=1139939"},"modified":"2025-05-27T07:38:55","modified_gmt":"2025-05-27T14:38:55","slug":"frodokem-a-conservative-quantum-safe-cryptographic-algorithm","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/frodokem-a-conservative-quantum-safe-cryptographic-algorithm\/","title":{"rendered":"FrodoKEM: A conservative quantum-safe cryptographic algorithm"},"content":{"rendered":"\n
\"The<\/figure>\n\n\n\n

In this post, we describe FrodoKEM, a key encapsulation mechanism that offers a simple design and provides strong security guarantees even in a future with powerful quantum computers.<\/p>\n\n\n\n

The quantum threat to cryptography<\/h2>\n\n\n\n

For decades, modern cryptography has relied on mathematical problems that are practically impossible for classical computers to solve without a secret key. Cryptosystems like RSA, Diffie-Hellman key exchange, and elliptic curve-based schemes\u2014which rely on the hardness of the integer factorization and (elliptic curve) discrete logarithm problems\u2014secure communications on the internet, banking transactions, and even national security systems. However, the emergence of quantum computing poses a significant threat to these cryptographic schemes.<\/p>\n\n\n\n

Quantum computers leverage the principles of quantum mechanics to perform certain calculations exponentially faster than classical computers. Their ability to solve complex problems, such as simulating molecular interactions, optimizing large-scale systems, and accelerating machine learning, is expected to have profound and beneficial implications for fields ranging from chemistry and material science to artificial intelligence.<\/p>\n\n\n\n

At the same time, quantum computing is poised to disrupt cryptography. In particular, Shor\u2019s algorithm, a quantum algorithm developed in 1994, can efficiently factor large numbers and compute discrete logarithms\u2014the very problems that underpin the security of RSA, Diffie-Hellman, and elliptic curve cryptography. This means that once large-scale, fault-tolerant quantum computers become available, public-key protocols based on RSA, ECC, and Diffie-Hellman will become insecure, breaking a sizable portion of the cryptographic backbone of today\u2019s digital world. Recent advances in quantum computing, such as Microsoft\u2019s Majorana 1, the first quantum processor powered by topological qubits, represent major steps toward practical quantum computing and underscore the urgency of transitioning to quantum-resistant cryptographic systems.<\/p>\n\n\n\n

To address this looming security crisis, cryptographers and government agencies have been working on post-quantum cryptography (PQC)\u2014new cryptographic algorithms that can resist attacks from both classical and quantum computers.<\/p>\n\n\n\n

The NIST Post-Quantum Cryptography Standardization effort<\/h2>\n\n\n\n

In 2017, the U.S. National Institute of Standards and Technology (NIST) launched the <strong>Post-Quantum Cryptography Standardization project<\/strong> to evaluate and select cryptographic algorithms capable of withstanding quantum attacks. As part of this initiative, NIST sought proposals for two types of cryptographic primitives: key encapsulation mechanisms (KEMs)\u2014which enable two parties to securely derive a shared key to establish an encrypted connection, similar to traditional key exchange schemes\u2014and digital signature schemes.<\/p>\n\n\n\n

This initiative attracted submissions from cryptographers worldwide, and after multiple evaluation rounds, NIST selected CRYSTALS-Kyber, a KEM based on structured lattices, and standardized it as ML-KEM. Additionally, NIST selected three digital signature schemes: CRYSTALS-Dilithium, now called ML-DSA; SPHINCS<sup>+<\/sup>, now called SLH-DSA; and Falcon, now called FN-DSA.<\/p>\n\n\n\n

While ML-KEM provides strong overall security and efficiency, some governments and cryptographic researchers advocate for the inclusion and standardization of alternative algorithms that minimize reliance on algebraic structure. Reducing algebraic structure might prevent potential vulnerabilities and, hence, can be considered a more conservative design choice. One such algorithm is <strong>FrodoKEM<\/strong>.<\/p>\n\n\n\n

International standardization of post-quantum cryptography<\/h2>\n\n\n\n

Beyond NIST, other international standardization bodies have been actively working on quantum-resistant cryptographic solutions. The <strong>International Organization for Standardization (ISO)<\/strong> is leading a global effort to standardize additional PQC algorithms. Notably, European government agencies\u2014including Germany\u2019s BSI, the Netherlands\u2019 NLNCSA and AIVD, and France\u2019s ANSSI\u2014have shown strong support for FrodoKEM, recognizing it as a conservative alternative to structured lattice-based schemes.<\/p>\n\n\n\n

As a result, <strong>FrodoKEM is undergoing standardization at ISO.<\/strong> Additionally, ISO is standardizing ML-KEM and a conservative code-based KEM called Classic McEliece. These three algorithms are planned for inclusion in ISO\/IEC 18033-2:2006 as Amendment 2.<\/p>\n\n\n\n

What is FrodoKEM?<\/h2>\n\n\n\n

FrodoKEM is a key encapsulation mechanism (KEM) based on the <strong>Learning with Errors (LWE) problem<\/strong>, a cornerstone of lattice-based cryptography. Unlike structured lattice-based schemes such as ML-KEM, FrodoKEM is built on generic, unstructured lattices, i.e., it is based on the <em>plain<\/em> LWE problem.<\/p>\n\n\n\n

Why unstructured lattices?<\/h3>\n\n\n\n

Structured lattice-based schemes introduce additional algebraic properties that could potentially be exploited in future cryptanalytic attacks. By using unstructured lattices, FrodoKEM eliminates these concerns, making it a safer choice in the long run, albeit at the cost of larger key sizes and lower efficiency.<\/p>\n\n\n\n

It is important to emphasize that no particular cryptanalytic weaknesses are currently known for recommended parameterizations of structured lattice schemes in comparison to plain LWE. However, our current understanding of the security of these schemes could potentially change in the future with cryptanalytic advances.<\/p>\n\n\n\n

Lattices and the Learning with Errors (LWE) problem<\/h2>\n\n\n\n

Lattice-based cryptography relies on the mathematical structure of lattices, which are regular arrangements of points in multidimensional space. A lattice is defined as the set of all integer linear combinations of a set of basis vectors. The difficulty of certain computational problems on lattices, such as the Shortest Vector Problem (SVP) and the Learning with Errors (LWE) problem, forms the basis of lattice-based schemes.<\/p>\n\n\n\n
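As a toy illustration of this definition, the following sketch enumerates lattice points in two dimensions. The basis vectors are hypothetical values chosen only for readability and have nothing to do with FrodoKEM's actual parameters:

```python
from itertools import product

# Two basis vectors for a 2-D lattice (toy values for illustration).
b1, b2 = (2, 1), (1, 3)

# A lattice is the set of ALL integer linear combinations c1*b1 + c2*b2;
# here we enumerate only the combinations with small coefficients.
points = {(c1 * b1[0] + c2 * b2[0], c1 * b1[1] + c2 * b2[1])
          for c1, c2 in product(range(-2, 3), repeat=2)}

print((0, 0) in points)   # True: the origin is always a lattice point
print((3, 4) in points)   # True: b1 + b2
```

Real cryptographic lattices live in hundreds of dimensions, where finding short vectors among these combinations becomes computationally infeasible.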

The Learning with Errors (LWE) problem<\/h3>\n\n\n\n

The LWE problem is a fundamental hard problem in lattice-based cryptography. It involves solving a system of linear equations where some small random error has been added to each equation, making it extremely difficult to recover the original secret values. This added error ensures that the problem remains computationally infeasible, even for quantum computers. Figure 1 below illustrates the LWE problem, specifically, the search version of the problem.<\/p>\n\n\n\n

As can be seen in Figure 1, for the setup of the problem we need a dimension \\(n\\) that defines the size of matrices, a modulus \\(q\\) that defines the value range of the matrix coefficients, and a certain error distribution \\(\\chi\\) from which we sample \\(\\textit{\u201csmall\u201d}\\) matrices. We sample two matrices from \\(\\chi\\), a small matrix \\(\\text{s}\\) and an error matrix \\(\\text{e}\\) (for simplicity in the explanation, we assume that both have only one column); sample an \\(n \\times n\\) matrix \\(\\text{A}\\) uniformly at random; and compute \\(\\text{b} = \\text{A} \\times \\text{s} + \\text{e}\\). In the illustration, each matrix coefficient is represented by a colored square, and the \u201clegend of coefficients\u201d gives an idea of the size of the respective coefficients, e.g., orange squares represent the small coefficients of matrix \\(\\text{s}\\) (small relative to the modulus \\(q\\)). Finally, given \\(\\text{A}\\) and \\(\\text{b}\\), the search LWE problem consists in finding \\(\\text{s}\\). This problem is believed to be hard for suitably chosen parameters (e.g., for dimension \\(n\\) sufficiently large) and is used at the core of FrodoKEM.<\/p>\n\n\n\n
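The setup above can be sketched in a few lines of Python. This is a minimal illustration with toy parameters (\(n = 8\), \(q = 97\), and a crude stand-in for the error distribution \(\chi\)); these values are far too small to be secure and are not FrodoKEM's actual parameterization:

```python
import random

# Toy LWE parameters -- for illustration only, far too small to be secure.
n, q = 8, 97                     # dimension and modulus (hypothetical values)

def small():
    # Crude stand-in for the error distribution chi: coefficients near zero,
    # i.e., "small" relative to the modulus q.
    return random.randint(-2, 2)

# Uniformly random n x n public matrix A, with coefficients mod q.
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]

# Secret vector s and error vector e, both sampled with small coefficients.
s = [small() for _ in range(n)]
e = [small() for _ in range(n)]

# Public value b = A*s + e (mod q). Given (A, b), recovering s is the
# search-LWE problem; without e, plain Gaussian elimination would solve it.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]
```

The comment on the last line captures the key intuition: the added error is what turns an easy linear-algebra exercise into a conjecturally hard problem.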

In comparison, the LWE variant used in ML-KEM\u2014called Module-LWE (M-LWE)\u2014has additional symmetries, adding mathematical structure that helps improve efficiency. In a setting similar to that of the search LWE problem above, the matrix \\(\\text{A}\\) can be represented by just a single row of coefficients.<\/p>\n\n\n\n
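To see how such structure shrinks the public matrix, consider the closely related Ring-LWE setting, working modulo \(x^n + 1\): there an \(n \times n\) block is fully determined by its first row, with each subsequent row a negacyclic rotation. This is an illustrative sketch of that storage saving, not ML-KEM's exact construction (Module-LWE uses a small matrix of such ring elements):

```python
# In Ring-LWE (arithmetic in Z_q[x]/(x^n + 1)), the matrix representing
# multiplication by a ring element is determined by one row of n
# coefficients: each later row is a negacyclic rotation of the previous one.
def negacyclic_matrix(row, q):
    n = len(row)
    M = []
    cur = list(row)
    for _ in range(n):
        M.append(list(cur))
        # Rotate right by one; the wrapped coefficient picks up a minus
        # sign because x^n = -1 in this ring.
        cur = [(-cur[-1]) % q] + cur[:-1]
    return M

M = negacyclic_matrix([1, 2, 3, 4], 97)
# M[1] == [93, 1, 2, 3], since -4 mod 97 = 93
```

So \(n\) coefficients stand in for \(n^2\), which is exactly the efficiency gain that the extra algebraic symmetry buys; plain LWE, as used in FrodoKEM, stores (or pseudorandomly expands) the full matrix instead.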

\"Visualization
FIGURE 1:<\/strong> Visualization of the (search) LWE problem.<\/figcaption><\/figure>\n\n\n\n

LWE is conjectured to be quantum-resistant, and FrodoKEM\u2019s security is directly tied to its hardness. In other words, cryptanalysts and quantum researchers have not been able to devise an efficient quantum algorithm capable of solving the LWE problem and, hence, of breaking FrodoKEM. In cryptography, absolute security can never be guaranteed; instead, confidence in a problem\u2019s hardness comes from extensive scrutiny and its resilience against attacks over time.<\/p>\n\n\n\n

How FrodoKEM works<\/h2>\n\n\n\n

FrodoKEM follows the standard paradigm of a KEM, which consists of three main operations\u2014key generation, encapsulation, and decapsulation\u2014performed interactively between a sender and a recipient with the goal of establishing a shared secret key:<\/p>\n\n\n\n

    \n
  1. Key generation (KeyGen), computed by the recipient<\/strong>\n