Post-quantum Cryptography: Impacts, Algorithms, and Hybrid Approaches!

Topics in Cryptology CT-RSA 2018 (Springer Press)

After a week-long dose of non-stop security adventure, I am back from RSA Conference… and here is my quick dump on PQC! “Post-quantum Cryptography (PQC) and strategies that resist quantum computer attacks on public-key cryptography” was one of the hottest topics in the Cryptographers’ Panel and in almost every cryptography panel and session – not surprised at all! While we know no commercial quantum computer is available yet, current research at D-Wave Systems, Google, Microsoft, and IBM shows that the vast space of superposition states during computation and the entanglement of qubits give quantum computers an inherent advantage on certain problems, letting them effectively explore an exponentially large number of computational states at once.

Impacts of Quantum Computing

Two quantum algorithms, Grover’s and Shor’s, account for most of the impact of quantum computing on the strength of existing cryptographic schemes.

Grover’s Algorithm gives a square-root speedup on unstructured search, which includes brute-forcing a cipher by trying every possible key. According to “Applying Grover’s Algorithm to AES: Quantum Resource Estimates” [2], a brute-force attack on a symmetric algorithm like AES-128, which requires about 2^128 AES operations on a classical computer, could be carried out with about 2^64 quantum AES evaluations. In effect, Grover’s algorithm halves the effective key size of symmetric schemes, so doubling the key size (e.g., moving to AES-256) restores resistance. The same applies to hash algorithms such as SHA-256: preimage search gains a square-root speedup (about 2^128 operations) and collision search gains a cube-root speedup (about 2^85.3 operations). More generally, problems that are computationally hard on a classical computer become less hard on a quantum computer for the security parameters (such as key size) originally chosen, so those parameters must be revisited as quantum computing evolves. The quick sketch below works out these effective security levels.
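As a quick back-of-the-envelope check, here is a small Python sketch of those effective security levels; the halving and cube-root exponents are just the generic Grover and BHT speedups mentioned above.

```python
# Back-of-the-envelope effective security levels under known quantum attacks.
# Grover halves the exponent for key search and preimage search; the
# Brassard-Hoyer-Tapp (BHT) algorithm cuts collision search to ~cube root.

def effective_bits(name, classical_bits, exponent_divisor):
    quantum_bits = classical_bits / exponent_divisor
    print(f"{name}: 2^{classical_bits} classical -> ~2^{quantum_bits:.1f} quantum")

effective_bits("AES-128 key search (Grover)", 128, 2)   # ~2^64
effective_bits("AES-256 key search (Grover)", 256, 2)   # ~2^128
effective_bits("SHA-256 preimage (Grover)",  256, 2)    # ~2^128
effective_bits("SHA-256 collision (BHT)",    256, 3)    # ~2^85.3
```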

Shor’s Algorithm efficiently solves integer factorization and discrete logarithms in polynomial time on a quantum computer, which breaks asymmetric cryptographic schemes like RSA and ECC outright. Unlike Grover’s speedup, it cannot be mitigated by increasing the key size or other parameters of the asymmetric algorithms.
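The quantum part of Shor’s algorithm is order finding: given a and N, find the multiplicative order r of a modulo N; everything else is classical post-processing. The toy Python sketch below brute-forces the order (the step that takes exponential time classically) just to show how knowing r yields the factors; the numbers and helper names are illustrative.

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n -- the step Shor's algorithm performs
    in polynomial time on a quantum computer. Here we simply brute-force it."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n, a):
    """Classical post-processing: an even order r with a^(r/2) != -1 (mod n)
    yields a nontrivial factor of n via a gcd computation."""
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky pick: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry with another a
    return gcd(y - 1, n)

print(factor_via_order(15, 7))    # -> 3, since order(7 mod 15) = 4
```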

Worse still, any adversary who sniffs Internet traffic and records secure communications today would be able to decrypt those recordings once a quantum computer becomes available. We know from several pieces of evidence that many governments are already recording and storing encrypted Internet traffic for data-mining purposes (I may be wrong). This year, the CT-RSA 2018 (Cryptographers’ Track at RSA 2018) sessions were overwhelmingly dominated by PQC algorithms.

Post-Quantum Cryptography Algorithms 

We already have several PQC algorithms (in early implementation or testing) that are believed to resist attacks by quantum computers; with efficient cryptographic schemes (in terms of smaller key sizes and computation workload), they could potentially replace the asymmetric algorithms used in public-key cryptography today.

  • Lattice-based cryptography – Proposed by M. Ajtai in 1996 [3], one of the earliest cryptographic approaches relying on the hardness of computational lattice problems, which led to the NTRU public-key encryption scheme based on algebraically structured lattices. In 2005, Regev introduced the Learning With Errors (LWE) problem (itself based on lattice problems), which serves as the basis for a variety of public-key encryption and signature schemes. Following LWE, in 2010 Lyubashevsky, Peikert, and Regev introduced Ring-LWE, which uses additional algebraic structure to allow smaller key sizes. An NTRU implementation is available and can be used with multiple commercial-grade crypto implementations (e.g., OpenSSL, wolfSSL, BouncyCastle, and a few others). A toy LWE encryption sketch appears after this list.
  • Multivariate cryptography – Based on the difficulty of solving systems of non-linear (usually quadratic) polynomial equations over a finite field. The hardness depends on the size of the finite field, the number of variables, and the degree of the system. To build an asymmetric public-key system, the public key is a set of multivariate quadratic polynomials and the private key is knowledge of a trapdoor that allows solving the multivariate system. Random multivariate systems can also be used for pseudo-random-number generators, cryptographic hash functions, and symmetric encryption.
  • Code-based cryptography – McEliece public-key encryption uses error-correcting codes to hide the contents of a message during transmission over an unreliable channel. The sender deliberately adds errors to protect the message against an eavesdropper: the receiver’s public key is a generator matrix of the receiver’s code, the sender encrypts the message by converting it into a codeword and adding a secret error vector, and the receiver decodes the corrupted codeword to obtain the message. McEliece used binary Goppa codes and requires key sizes of about 4 Mb in order to assure quantum-resistant security. Niederreiter proposed an alternative approach: instead of introducing errors into the codeword before transmission, it encodes the plaintext as the error, a bit string of weight w, and uses a parity-check matrix rather than a generator matrix as the public key. The sender encodes the plaintext as such a bit string and computes its syndrome; the receiver uses a syndrome-decoding algorithm to recover the original error vector. Beyond public-key cryptosystems, Niederreiter-style constructions have also been proposed for signature schemes, hash functions, and random-number generators.
  • Hash-based cryptography – Considered more mature, affected by Grover’s algorithm only through the generic square-root speedup, and reliable for constructing PQC schemes. Merkle demonstrated how to convert one-time signature schemes, such as those of Lamport, Diffie, and Winternitz, into many-time signature schemes, and hash-based signature schemes are currently in IETF standardization. Stateful variants do carry a security risk: accidentally reusing private-key state, for example after restoring from a backup, breaks their security. Variants such as XMSS (stateful) and SPHINCS (stateless, at the cost of larger signatures) are considered quantum-resistant.
  • Supersingular elliptic-curve isogeny cryptography – The newest addition to PQC, based on the difficulty of finding isogenies between supersingular elliptic curves. Its key-exchange constructions have a structure similar to classical Diffie-Hellman and ECDH.
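To make the LWE idea concrete, here is a toy single-bit Regev-style encryption in Python. It is a minimal sketch with deliberately tiny, insecure parameters (the modulus q, dimension n, sample count m, and the error distribution are illustrative choices, not those of any real scheme): the security intuition is that recovering s from (A, As + e) is believed hard even for quantum computers.

```python
import random

# Toy single-bit Regev-style LWE encryption. Demo parameters only -- real
# schemes use far larger dimensions and carefully chosen error distributions.
q, n, m = 257, 8, 32              # modulus, secret dimension, sample count

def keygen():
    s = [random.randrange(q) for _ in range(n)]                   # secret key
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]             # small errors
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)              # private key, public key (A, As + e)

def encrypt(pub, bit):
    A, b = pub
    subset = [i for i in range(m) if random.random() < 0.5]       # random subset sum
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q          # embed bit near q/2
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0    # noise stays near 0 or q/2

s, pub = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pub, bit)) == bit
print("toy LWE round-trip OK")
```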

Hybrid approaches

We do know it is premature to rely on PQC algorithms alone, as they have not been tested and vetted by the global cryptographic community as thoroughly as traditional public-key and symmetric-key algorithms. That said, we can consider hybrid approaches that fulfill current security requirements while potentially assuring quantum resistance in the future. NIST does approve hybrid approaches (hybrid ciphersuites combining one traditional public-key algorithm with one PQC algorithm), which retain the security of traditional algorithms while also meeting quantum-safe requirements. NIST recently approved a digital-signature approach that signs a message twice: first with a PQC algorithm based on stateful hash-based signatures, and then with a NIST-validated signature scheme. For key exchange, both parties establish two shared secrets, one using a NIST-approved current scheme (ECDH) and a second with a PQC scheme like NewHope (lattice-based). This means that in a typical TLS scenario, the handshake runs two key-exchange algorithms, ECDH (traditional) and NewHope (PQC). A minimal sketch of deriving a hybrid session key follows.
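Here is a minimal sketch of the hybrid key-exchange idea using the Python cryptography package. A real ECDH exchange supplies the classical secret; a random placeholder stands in for the NewHope shared secret (no NewHope call is made here, that half is hypothetical); both feed a single KDF, so the session key stays secure as long as either exchange remains unbroken.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: a genuine ECDH exchange between two parties.
client_ec = ec.generate_private_key(ec.SECP256R1())
server_ec = ec.generate_private_key(ec.SECP256R1())
ecdh_secret = client_ec.exchange(ec.ECDH(), server_ec.public_key())

# PQC half: placeholder for a NewHope-style shared secret -- a real
# deployment would run the lattice key exchange here (hypothetical stand-in).
pqc_secret = os.urandom(32)

# Hybrid session key: both secrets feed one HKDF invocation, so breaking
# the session key requires breaking BOTH key exchanges.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-tls-demo",
).derive(ecdh_secret + pqc_secret)
```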

A hybrid approach with NewHope Algorithm – https://security.googleblog.com/2016/07/experimenting-with-post-quantum.html

Recently, Google demonstrated a hybrid approach for TLS between the Chrome browser and Google servers using two key-exchange algorithms: standard ECDH and NewHope (a lattice-based algorithm).

PQC approaches are evolving fast in terms of design, cryptanalysis, and implementation; sooner or later we should be able to adopt them without major changes to our applications. Let’s stay tuned!

 

HSM-based Hybrid Approach

Currently, no HSM provider claims to support PQC ciphers. As I learned from an engineer at one HSM vendor (Utimaco), crypto software kits that support HSMs can be used as part of a hybrid approach: a PQC key exchange like NewHope establishes a shared symmetric key to encrypt communication between the HSM and the application, and a PQC digital signature can be combined with an HSM-facilitated signature algorithm to sign hashes. Microsoft recently introduced an implementation of the Picnic signature scheme for quantum-safe digital signature operations, built on a zero-knowledge proof system (based on a multi-party computation protocol) and using primitives like hash functions and block ciphers. Microsoft released a reference implementation of Picnic on GitHub (which can be used with select HSMs, such as Utimaco’s) and recently submitted Picnic as a candidate to the NIST PQC signature standardization. A sketch of the dual-signing idea follows.
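As a sketch of the dual-signing idea, the following combines an ECDSA signature (standing in for the HSM-facilitated, NIST-validated algorithm; a real deployment would call the HSM, e.g. over PKCS#11) with a toy Lamport one-time hash-based signature standing in for Picnic, since no Python Picnic binding is assumed here. A verifier accepts the message only if both signatures check out.

```python
import hashlib, os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

H = lambda b: hashlib.sha256(b).digest()

# --- Post-quantum half: toy Lamport one-time signature (stands in for
# Picnic; one Lamport key pair must sign only ONE message, ever). ---
def lamport_keygen():
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def bits(digest):
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def lamport_sign(sk, msg):
    return [sk[i][b] for i, b in enumerate(bits(H(msg)))]

def lamport_verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(H(msg))))

# --- Classical half: ECDSA (a real HSM would perform this internally). ---
message = b"firmware image v1.2"
ec_key = ec.generate_private_key(ec.SECP256R1())
classical_sig = ec_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Hybrid envelope: both signatures must verify, so a forger would have to
# break the classical AND the post-quantum scheme.
l_sk, l_pk = lamport_keygen()
pqc_sig = lamport_sign(l_sk, message)

ec_key.public_key().verify(classical_sig, message, ec.ECDSA(hashes.SHA256()))
assert lamport_verify(l_pk, message, pqc_sig)
print("hybrid signature verified")
```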

References:
  1. Topics in Cryptology – CT-RSA 2018 (The Cryptographer’s Track at the RSA Conference 2018), Springer.
  2. M. Grassl, B. Langenberg, M. Roetteler, and R. Steinwandt – Applying Grover’s Algorithm to AES: Quantum Resource Estimates – PQCrypto 2016.
  3. M. Ajtai – Generating Hard Instances of Lattice Problems – ACM STOC 1996.
  4. M. Waidner, R. Niederhagen, T. Grotker, and P. Reinelt – Post-Quantum Crypto.
  5. D. Stebila and M. Mosca – Post-Quantum Key Exchange for the Internet and the Open Quantum Safe Project.
  6. Microsoft Picnic reference implementation – https://github.com/Microsoft/Picnic
