“Anyone that wants to make sure that their data is protected for longer than 10 years should move to alternate forms of encryption now.”
Arvind Krishna, Director of IBM Research (currently the CEO of IBM)
In March 2018, Google researchers presented a 72-qubit quantum processor called Bristlecone. Its exact performance characteristics were never fully disclosed. Apparently, they were far from ideal, because Google officially announced quantum supremacy only in October 2019, after the release of the 53-qubit Sycamore processor.
Quantum supremacy is the goal of demonstrating that a programmable quantum device can solve a problem that no classical computer can solve in any feasible amount of time.
In 200 seconds, Sycamore obtained a result that a classical supercomputer would need an estimated 10,000 years to produce. IBM contested Google's supremacy claim; furthermore, it turned out that the task had been artificially designed for this demonstration. Still, it became clear that 50+ qubits is a threshold beyond which a quantum computer can ‘beat’ a classical one (although only in a narrow range of specific tasks).
At the end of 2020, researchers at the University of Science and Technology of China in Hefei announced that their quantum computer had solved a problem that the best existing supercomputers would take an impractically long time to solve (specifically, about half the age of the Earth).
Concurrently, IBM researchers announced plans to build a 127-qubit quantum computer in 2021, a 433-qubit computer in 2022, and a 1,121-qubit computer in 2023, with the prospect of eventually reaching a million qubits or more.
The problem, however, is not limited to the number of qubits. When quantum gates operate (i.e. perform unitary operations on qubits), errors inevitably occur. Their probability is low but nonzero. Errors accumulate in the course of a computation, and it's difficult to obtain the correct result without correction. In addition, environmental interaction causes decoherence of the system (i.e. destruction of its quantum state, the entangled state of the qubits in an isolated system).
The decoherence time (i.e. the time a system of qubits remains effectively isolated from the environment) is a critical factor for the creation of a large quantum computer. If this time is shorter than the time required for the quantum computation, errors take over, and the result is just randomized noise. Research topics such as system isolation, reduction of gate error rates, and design of effective error correction algorithms have therefore become priorities and are actively pursued in leading research centers.
To correct errors, quantum algorithms use logical qubits consisting of several physical qubits: the state of a single logical qubit is encoded in the entangled state of several physical ones. Therefore, a quantum computer with error correction requires many more physical qubits. For instance, Steane's 7-qubit code, one of the first proposed quantum error correction codes, uses seven physical qubits to store one logical qubit of quantum information.
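The core idea of encoding one logical bit into several physical ones can be illustrated classically. The sketch below simulates the simplest redundancy code, the 3-bit repetition code with majority-vote decoding (a toy stand-in for quantum codes like Steane's; all names and parameters are my own illustration):

```python
import random

def encode(bit):
    """Encode one logical bit into three physical copies (repetition code)."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one flip occurred."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.05            # physical error rate
trials = 100_000
raw_errors = sum(1 for _ in range(trials) if random.random() < p)
coded_errors = sum(1 for _ in range(trials)
                   if decode(apply_noise(encode(0), p)) != 0)
print(raw_errors / trials)    # close to p = 0.05
print(coded_errors / trials)  # close to 3p^2 - 2p^3 ~ 0.007
```

The coded error rate drops roughly from p to 3p², which is why redundancy pays off as long as the physical error rate is below a threshold; quantum codes achieve the analogous effect while also protecting superpositions.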
Quantum algorithms are developed concurrently with physical implementations of quantum computers. Popular science brochures describing the quantum programming basics are available in stores. Special mention must be made of the IBM project Quantum Experience that makes it possible for everyone to create quantum algorithms and run them on IBM quantum computers and simulators.
If you are interested in this topic, I strongly recommend a landmark book by Michael A. Nielsen and Isaac L. Chuang entitled “Quantum Computation and Quantum Information.”
What does this have to do with cryptography?
Quantum computers are practical for a rather narrow class of problems, above all the modeling of quantum systems. For instance, in chemistry, a quantum computer can be used to compute chemical reactions or the spatial molecular structure of a new material or drug. In logistics, it can address optimization problems (e.g. the travelling salesman problem). Another field of application is the cryptanalysis of existing ciphers.
In 1994, Shor's algorithm, which uses the quantum Fourier transform to factor large numbers, was published. Its publication fueled interest in quantum computers, intensified research in this area, and resulted in the allocation of enormous resources to the development of such devices.
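Shor's algorithm only needs the quantum computer for one step: finding the multiplicative order r of a random base a modulo N. The rest is classical number theory. The sketch below (my own illustration) brute-forces the order classically as a stand-in for the quantum Fourier transform step and then extracts the factors:

```python
from math import gcd

def order(a, n):
    """Multiplicative order r of a mod n (smallest r with a^r = 1 mod n).
    This is the step a quantum computer performs with the QFT;
    here it is brute-forced classically for illustration."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Given a coprime to n, use its order r to extract nontrivial factors."""
    r = order(a, n)
    if r % 2:
        return None            # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None            # trivial square root: retry with another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical_part(15, 7))   # order of 7 mod 15 is 4 -> factors (3, 5)
```

The exponential quantum speedup lies entirely in the `order` step; everything after it runs in polynomial time on a classical machine.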
US-based ASC X9 Inc. develops global financial standards, including cryptographic protection of financial data. Its members include Bank of America, Citigroup, JPMorgan Chase, the Federal Reserve, VISA, and the US National Security Agency. In January 2018, this organization raised concerns about the risks posed by quantum computers to financial institutions. At the beginning of 2019, ASC X9 Inc. released an information report entitled ASC X9 IR 01-2019. Its main finding is as follows: if a quantum computer capable of breaking the cryptography protecting bank cards (which are designed for 5 years of use) is expected by 2028, then classical cryptography should be phased out and protection transferred to quantum-resistant cryptography at least 5 years before a potential attack. Taking into account the time required for the transition from classical to post-quantum cryptography, this transition must be commenced immediately.
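The report's timeline logic is essentially Mosca's inequality: if data must stay secure for x years and migration takes y years, you are in trouble whenever x + y exceeds z, the number of years until a cryptographically relevant quantum computer arrives. A minimal sketch of this arithmetic (function names are my own):

```python
def migration_deadline(quantum_year, shelf_life_years):
    """Latest year by which classical crypto must be phased out so that
    data that stays sensitive for `shelf_life_years` is never exposed."""
    return quantum_year - shelf_life_years

def at_risk(security_years, migration_years, years_to_quantum):
    """Mosca's inequality: x + y > z means some data will be exposed."""
    return security_years + migration_years > years_to_quantum

# The report's example: a quantum computer by 2028, cards valid for 5 years.
print(migration_deadline(2028, 5))   # 2023
# If migration itself takes 10 years and the threat is 7 years away:
print(at_risk(5, 10, 7))             # True: start migrating immediately
```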
With reservations and no definitive conclusion, the information report estimates the annual increase in the number of qubits at 35%. Not Moore's law, but still… At that rate, a 1000-qubit quantum computer could be built within 7-8 years. In fact, however, the available data are insufficient for a reliable forecast.
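The 7-8 year figure is simple compound growth. A sketch of the arithmetic (the 100-qubit starting point is my assumption, roughly matching the rumored machines mentioned later; 72 qubits is Bristlecone):

```python
import math

def years_to_reach(target_qubits, current_qubits, annual_growth=0.35):
    """Years of compound growth needed: current * (1 + g)^n >= target."""
    return math.ceil(math.log(target_qubits / current_qubits)
                     / math.log(1 + annual_growth))

print(years_to_reach(1000, 72))    # from 72 qubits: 9 years
print(years_to_reach(1000, 100))   # from 100 qubits: 8 years
```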
It must be noted that the US National Security Agency has changed its recommendations for the cipher suites used to protect classified information. Curve P-256 for ECDH and ECDSA has disappeared from the so-called Suite B (now the CNSA Suite), although the suite still includes RSA with a 3072-bit modulus.
Interestingly, P-256 and RSA-3072 were traditionally considered equal in cryptographic strength, both on par with the symmetric AES-128 algorithm.
But the updated suite no longer includes AES-128 (even for the SECRET level), only AES-256, approved up to TOP SECRET. The FAQ issued by the NSA and Central Security Service (CSS) clearly states that the following algorithms, previously approved for SECRET, must no longer be used in National Security Systems (NSS):
- ECDH and ECDSA with NIST P-256;
- RSA with 2048-bit keys; and
- Diffie-Hellman with 2048-bit keys.
As a result, the CNSA Suite currently includes the following algorithms (until quantum-resistant ones are developed).
Combined with optimistic predictions about the development of quantum computers, this information suggests a plausible explanation. According to a paper published by Microsoft Research in 2017, 2330 qubits are sufficient to compute the discrete logarithm on the standard P-256 elliptic curve over a finite field, while factoring an RSA 3072-bit modulus requires 6146 qubits (i.e. almost 3 times more).
In other words, against a quantum computer, P-256 and RSA-3072 are not equal in cryptographic strength. The NSA could have obtained similar data earlier; there may also exist a more efficient optimization of Shor's algorithm for elliptic curves.
If a quantum computer with 2300+ qubits is built within 10 years, it will be able to crack short ECC keys (starting with 256-bit keys) using Shor's algorithm and will roughly halve the effective strength of symmetric AES keys using Grover's quantum search algorithm. This means that the cryptographic strength of AES-128 will drop to 2^64 operations, which may become unacceptable in the foreseeable future. By contrast, the strength of AES-256 will decrease to 2^128 operations, which is still sufficient: today, researchers consider 2^128 operations infeasible.
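The asymmetry between the two attacks can be summarized in a few lines: Grover gives only a square-root speedup (halving the security level in bits), while Shor breaks the underlying math of RSA and ECC outright. A toy tally (my own illustration):

```python
def quantum_strength_bits(key_bits, symmetric=True):
    """Effective security against a quantum attacker:
    Grover's search halves symmetric key strength (square-root speedup);
    asymmetric schemes broken by Shor drop to essentially zero."""
    return key_bits // 2 if symmetric else 0

for cipher, bits in [("AES-128", 128), ("AES-256", 256)]:
    print(f"{cipher}: 2^{bits} classical -> "
          f"2^{quantum_strength_bits(bits)} quantum")
# AES-128: 2^128 classical -> 2^64 quantum
# AES-256: 2^256 classical -> 2^128 quantum
```

This is why the post-quantum fix for symmetric crypto is simply doubling the key size, whereas public-key crypto needs entirely new mathematical foundations.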
According to the article Applying Grover’s Algorithm to AES: Quantum Resource Estimates published in 2015, 2953 qubits are required to break AES-128.
However, in 2019, researchers at Florida Atlantic University optimized the quantum algorithm and found that a mere 865 qubits are sufficient to crack AES-128.
It seems, however, that even a 1000-qubit quantum computer won't be able to mount such an attack due to the enormous number of quantum operations required. It's also understandable why the NSA keeps using RSA-3072: this key length will remain strong for a considerable period of time.
But the most exciting and intriguing statement in the ASC X9 report is the following. At the time of publication, the largest publicly known quantum computer had 72 qubits. Yet the authors cite an opinion that more powerful quantum computers already exist, although information about them is classified. They don't even rule out the existence of 100-qubit quantum computers!
Note that the authors mention not one 100-qubit quantum computer, but at least two such devices.
It's not surprising, therefore, that the National Institute of Standards and Technology (NIST) of the US Department of Commerce initiated a process to solicit, evaluate, and standardize one or more quantum-resistant public-key cryptographic algorithms. Two rounds have already been completed, and 7 main algorithms and 8 alternates were selected for the third round.
The Round 3 results are expected by mid-2022. One or two key exchange and digital signature algorithms may be approved as US standards, and a Round 4 cannot be ruled out to select algorithms for specific application areas. In the meantime, work continues: seminars are held, the submitted algorithms are refined, and attacks on them are developed and improved.
Along with the algorithms themselves, software implementations are being developed. The Open Quantum Safe (OQS) project is particularly active: within its framework, the liboqs library has been created, which includes a number of post-quantum algorithms shown in the diagram below. The library has been integrated into OpenSSL and some other applications.
Microsoft has launched its own project, Post-quantum Cryptography, and develops such algorithms as FrodoKEM, SIKE, Picnic, and qTESLA, as well as Post-Quantum Crypto VPN, Post-Quantum TLS, and Post-Quantum SSH apps.
Today, anyone can download and install post-quantum OpenVPN. So far, only for experimental and testing purposes, since the application hasn't passed a strict security audit and may contain bugs. In addition, the algorithms aren't officially standardized yet, so you use it at your own risk.
In Linux, you can unpack the archive right into the root directory (never do this on a production machine, but for a VM, it's OK):
# mv pq-openvpn-linux-staged.tar.gz /
# cd /
# tar -xvf pq-openvpn-linux-staged.tar.gz
The unpacked archive will place the OpenVPN binary and the pq-openvpn startup script into their proper locations under the root directory.
All settings in post-quantum OpenVPN are similar to those in regular OpenVPN, so I am not going to describe them in detail here. Note, though, that you have to set the ecdh-curve directive in both the server and client configuration files; otherwise, the key exchange will be classical, not post-quantum. Even the default value of ecdh-curve (a hybrid algorithm of the first post-quantum security level) has to be specified explicitly.
According to the microsoft/PQCrypto-VPN page on GitHub, only the Frodo and SIDH algorithms are supported. This is true, but on Windows 10, my client throws a memory read error with certain values of the ecdh-curve directive. Of course, this may be due to my specific OS configuration, drivers, or motherboard. By contrast, the connection works smoothly with frodo640shake and other algorithms of this family, including hybrid ones. In my experiments, I tried various key exchange algorithms, including hybrids such as p521_ntru_hps4096821 not officially declared by Microsoft, and they worked fine (as did pure post-quantum algorithms without the classical curve).
Keep in mind that in this PQCrypto-VPN implementation, post-quantum algorithms work only over TLSv1.3; therefore, you have to specify tls-version-min 1.3 in both the client and server configs.
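Putting the two directives together, a minimal config fragment (for both server and client) might look like this; frodo640shake is one of the group names mentioned above, and any other liboqs-supported group name could be substituted:

```
# post-quantum (or hybrid) key exchange group; must be set explicitly
ecdh-curve frodo640shake
# PQCrypto-VPN performs post-quantum key exchange only over TLS 1.3
tls-version-min 1.3
```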
According to the developer, after the connection is established, there is only one way to make sure that the key exchange was performed using a post-quantum algorithm: find the group_id entry in the server log:
Control Channel: TLSv1.3, cipher TLSv1.3 TLS_CHACHA20_POLY1305_SHA256, group_id 1278 (post-quantum key exchange)
In the server log screenshot, the string containing group_id is underlined. It indicates that a post-quantum key exchange algorithm was used (otherwise it would read NOT post-quantum key exchange). In this particular case, group_id 1278 (an OpenSSL NID) corresponds to the key exchange algorithm specified in the ecdh-curve directive of the OpenVPN config: a hybrid algorithm consisting of the classical Diffie-Hellman key exchange on the P-521 elliptic curve and the post-quantum FireSaber-KEM key encapsulation mechanism.
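The point of such a hybrid scheme is that the session key stays safe as long as either component remains unbroken. A sketch of the key-combination idea (stub byte strings stand in for the real ECDH and KEM outputs; real TLS 1.3 uses HKDF, a plain SHA-256 is used here for illustration):

```python
import hashlib

def hybrid_shared_secret(ecdh_secret: bytes, kem_secret: bytes) -> bytes:
    """Concatenate the classical ECDH shared secret with the post-quantum
    KEM shared secret and hash them into one session key, so breaking
    only one of the two components reveals nothing."""
    return hashlib.sha256(ecdh_secret + kem_secret).digest()

# Stub values standing in for real P-521 ECDH and FireSaber-KEM outputs
ecdh = b"\x01" * 66   # a P-521 shared secret is 66 bytes
kem = b"\x02" * 32    # FireSaber's shared secret is 32 bytes
key = hybrid_shared_secret(ecdh, kem)
print(len(key))  # 32
```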
FireSaber-KEM corresponds to the fifth post-quantum security level. It's based on the hardness of the Module Learning With Rounding (MLWR) problem, a variant of the Learning With Rounding (LWR) problem, which in turn derives from the Learning With Errors (LWE) problem of lattice-based cryptography. Rumor has it that NIST prefers lattice-based schemes to all others.
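The "rounding" in LWR replaces the explicit noise term of LWE: instead of adding a small random error to the inner product, the low-order bits are simply thrown away by rounding from a large modulus q down to a smaller one p. A toy sample generator (parameters are deliberately tiny and insecure, for illustration only):

```python
import random

q, p = 1024, 256          # large modulus q, rounding modulus p (toy sizes)

def lwr_sample(s):
    """One LWR sample: a random vector a and b = round((p/q) * <a, s>) mod p.
    Rounding discards low bits, playing the role of LWE's explicit noise."""
    a = [random.randrange(q) for _ in s]
    inner = sum(ai * si for ai, si in zip(a, s)) % q
    b = round(inner * p / q) % p
    return a, b

random.seed(1)
secret = [random.randrange(q) for _ in range(4)]
a, b = lwr_sample(secret)
print(a, b)
```

Recovering `secret` from many such (a, b) pairs is the (toy) LWR problem; schemes like Saber use the module variant, where the vectors live over polynomial rings.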
A few words about certificate-based authentication on the server. Different options can be used: classical (i.e. not post-quantum) signature algorithms, post-quantum signature schemes, or hybrid ones. The Picnic and qTESLA algorithms are declared on the project web page, but you can also use other algorithms included in the post-quantum OpenSSL fork (OQS-OpenSSL).
Now I am going to explain how to install OQS-OpenSSL on Debian 10.9.0.
First of all, you have to install dependencies:
# apt install cmake gcc libtool libssl-dev make ninja-build git python3-pytest python3-pytest-xdist unzip xsltproc doxygen graphviz
Then clone the OQS-OpenSSL code:
$ git clone --branch OQS-OpenSSL_1_1_1-stable https://github.com/open-quantum-safe/openssl.git OQS-OpenSSL
Clone and build the liboqs library (the fourth line specifies the directory with OQS-OpenSSL):
$ git clone --branch main https://github.com/open-quantum-safe/liboqs.git
$ cd liboqs
$ mkdir build && cd build
$ cmake -GNinja -DCMAKE_INSTALL_PREFIX=/path/to/OQS-OpenSSL/oqs ..
$ ninja
# ninja install
Switch to the OQS-OpenSSL folder and run the following commands:
$ ./Configure no-shared linux-x86_64 -lm
$ make
That’s it. Now you can generate post-quantum keys and issue certificates.
If you need signature algorithms that aren't enabled by default, set enable for the desired algorithm in the generator template under oqs-template/ before building OQS-OpenSSL, and then execute two commands:
$ python3 oqs-template/generate.py
$ make generate_crypto_objects
Let's try picnic_L1_UR as the signature algorithm: it uses the Unruh transform, which is designed to remain secure in the quantum random oracle model.
The quantum random oracle model is a setting where the adversary can make quantum (superposition) queries to the random oracle. The Unruh transform differs from the more common classical Fiat-Shamir (FS) transform used in picnic-L1-FS and similar variants.
Generally speaking, Picnic is a signature algorithm built on a zero-knowledge proof (ZKP) cryptographic protocol. The zk-SNARK protocol, which belongs to the same ZKP family, is used in the Zcash cryptocurrency, positioned by its developers as the most anonymous one (neither the transaction amount, nor the sender, nor the recipient can be identified).
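To see how the Fiat-Shamir transform turns an interactive zero-knowledge proof into a non-interactive one, here is a toy Schnorr-style proof of knowledge of a discrete logarithm, with the verifier's random challenge replaced by a hash. This is not Picnic's construction, just the generic FS idea with tiny, deliberately insecure parameters:

```python
import hashlib

p_mod = 2267          # small prime; work in the group Z_p*
g = 2                 # base
x = 1234              # prover's secret (a discrete log)
y = pow(g, x, p_mod)  # public key

def fs_challenge(*vals):
    """Fiat-Shamir: the verifier's random challenge becomes a hash."""
    h = hashlib.sha256("|".join(map(str, vals)).encode()).digest()
    return int.from_bytes(h, "big") % (p_mod - 1)

def prove(x, k=777):
    t = pow(g, k, p_mod)             # commitment
    c = fs_challenge(g, y, t)        # challenge derived by hashing, no interaction
    s = (k + c * x) % (p_mod - 1)    # response; reveals nothing about x alone
    return t, s

def verify(t, s):
    c = fs_challenge(g, y, t)
    return pow(g, s, p_mod) == (t * pow(y, c, p_mod)) % p_mod

t, s = prove(x)
print(verify(t, s))  # True
```

The check works because g^s = g^(k + c·x) = t · y^c mod p; a forger who cannot predict the hash output cannot construct a valid (t, s) pair without knowing x.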
Time to generate keys using the picnicl1ur algorithm and issue certificates. For convenience, I create a CERTIFICATES subfolder in the OQS-OpenSSL directory and execute standard OpenSSL commands, changing only the key algorithm:
$ apps/openssl genpkey -algorithm picnicl1ur -out CERTIFICATES/picnic_ca.key
$ apps/openssl req -x509 -key CERTIFICATES/picnic_ca.key -out CERTIFICATES/picnic_ca.crt -subj "/CN=Picnic CA" -days 100 -sha512 -config apps/openssl.cnf
Reviewing the certificate:
$ apps/openssl x509 -noout -text -in CERTIFICATES/picnic_ca.crt
Version: 3 (0x2)
Signature Algorithm: picnicl1ur
Issuer: CN = Picnic CA
Not Before: Apr 12 05:48:21 2021 GMT
Not After : Jul 21 05:48:21 2021 GMT
Subject: CN = Picnic CA
Subject Public Key Info:
Public Key Algorithm: picnicl1ur
X509v3 Subject Key Identifier:
X509v3 Authority Key Identifier:
X509v3 Basic Constraints: critical
    CA:TRUE
Signature Algorithm: picnicl1ur
    21:82:09:59:16:85:a6:68:52:08:68:a6:02:99:aa:a9:90:14:
The next step is to generate keys and issue certificates for the server and clients. To add useful certificate extensions, I create an extensions file with content like this:
[server]
basicConstraints = CA:FALSE
keyUsage = digitalSignature, keyEncipherment
extendedKeyUsage = serverAuth
subjectAltName = @alt_names

[alt_names]
DNS = 192.168.3.5

[client]
basicConstraints = CA:FALSE
keyUsage = digitalSignature, keyEncipherment
extendedKeyUsage = clientAuth
Then I execute commands for the server:
$ apps/openssl req -new -newkey picnicl1ur -keyout CERTIFICATES/server.key -out CERTIFICATES/server.csr -nodes -subj "/CN=mysite.ru" -config apps/openssl.cnf -sha512
$ apps/openssl x509 -req -in CERTIFICATES/server.csr -out CERTIFICATES/server.crt -CA CERTIFICATES/picnic_ca.crt -days 50 -CAkey CERTIFICATES/picnic_ca.key -CAcreateserial -sha512 -extensions server -extfile extensions
The client's certificate can be issued in a similar way, but with client instead of server in the -extensions option.
The pre-built PQCrypto-VPN binaries don't support the picnicl1ur algorithm, so you have to build them yourself. Fortunately, Microsoft developers have created a build.py script for this purpose, and it does everything automatically. The required dependencies are listed at the beginning of the script.
As soon as post-quantum algorithms are standardized, the global transition to post-quantum cryptography will begin. Browsers will switch as well (Google has already conducted such an experiment in Chrome Canary). The transition will be time- and labor-consuming, and some companies already offer assistance with implementing post-quantum cryptography (e.g. the Oxford University spin-out PQShield and Russia-based QApp).