Abstract
The risk posed by quantum computing is regarded as a growing threat. Recently, even the Bank for International Settlements published a report analyzing the possible quantum risks of classical financial infrastructure [1]. Considering the rise of blockchain-based systems, this threat may be even greater: on the one hand, these systems are complex and contain many different components with different cryptographic primitives; on the other hand, an increasing number of mission-critical applications are implemented on blockchain platforms, such as stablecoins, CBDCs, deposit tokens or identity solutions. In this article, we propose a quantum risk assessment framework for distributed ledger-based platforms to systematically evaluate the quantum vulnerabilities of a DLT platform, to assess the likelihood and impact of different quantum computing attacks, and to take steps to mitigate the risk. The framework integrates relatively easily with classical technology and risk management approaches.
Quantum world and quantum computing
Quantum computing is one of the emerging technologies that can disrupt many existing industries. The technology is based on quantum physical and quantum mechanical properties of photons, electrons and other particles, exploiting their dual wave-particle character in computation. At this deep physical level, things work quite differently from what we would intuitively expect. One of the most famous experiments demonstrating the counterintuitive nature of the quantum world is the double-slit experiment (Figure 1). In this experiment, photons or other particles are fired from an electron beam or laser source at a plate with two parallel slits. Behind the plate, a screen records the impact of individual photons. If the source is sensitive enough that we can fire photons one by one, we observe them hitting the screen one by one. If, however, we fire many photons and only measure the aggregated pattern on the screen, we obtain an interference pattern characteristic of waves. Things get even stranger if we also place detectors directly at the slits.
There are many competing ideas and theories trying to explain the double-slit and similar experiments, involving heavy mathematics and complex physical explanations. Without covering these theories in detail, we can say that a small enough object, such as a particle, a neutron or a photon, sometimes behaves as a particle and sometimes as a wave, or rather as a wave of probabilities.
Figure 1. Double-slit experiment
Source: Wikipedia, https://en.wikipedia.org/wiki/Double-slit_experiment
Although the underlying theory is still being investigated, computer scientists are building computational models and actual computers on this incomplete and sometimes inconsistent conceptual background. The basic conceptual building block is the so-called qubit. Just as classical computers use bits to hide the complexity of the physical hardware, such as transistors or analogue circuits, quantum computers use qubits (quantum bits). A classical bit can take exactly two values, either 0 or 1; a qubit can be in 0, in 1, and in any combination of the two at the same time (Figure 2). This state of holding 0, 1 and all the possible values in between is called superposition, and it effectively models the wave character of the underlying physical particle. A qubit can also be measured; when it is measured, a definite value of either 0 or 1 is obtained. At measurement, the wave character of the particle collapses and the particle-like description dominates, so the measured qubit behaves like an ordinary bit. The real strength of qubits over classical bits appears when several qubits are used in parallel: n qubits in superposition can represent two to the power of n computational states at the same time, which in certain situations yields exponentially faster computation than on classical computers.
Even though quantum computers can in principle be much faster than classical ones, building practical or even theoretical quantum algorithms is challenging. One way to build quantum algorithms is to exploit further physical aspects of wave-particle duality, such as entanglement and interference. Entanglement is a connection between two qubits in the sense that their states are not completely independent of each other. Although both qubits are in superposition, meaning that they hold zero, one and all possible values in between, measuring one qubit determines the measured value of the other qubit as well. The exact physical explanation of entanglement is still not fully understood, but this does not prevent computer scientists from using this kind of deterministic connection between qubits to build quantum algorithms. One way to implement a quantum algorithm is to use elementary deterministic transformations of qubits as building blocks, called quantum gates. A quantum gate transforms the superposed state of one or several qubits into another state, playing a role similar to that of logic gates in classical computers and digital circuits. Quantum gates can be combined into more complex transformations, called quantum circuits.
Figure 2. Quantum bit, qubit
Source: Wikipedia, https://en.wikipedia.org/wiki/Qubit
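As a purely classical sketch of these ideas (simulating the mathematics with numpy on an ordinary computer, not running on quantum hardware), a qubit can be represented by two complex amplitudes, and measurement yields 0 or 1 with probabilities given by those amplitudes; n qubits would require a vector of 2^n amplitudes:

```python
# Classical simulation of a single qubit: the state is a pair of complex
# amplitudes; measuring collapses it to 0 or 1 with the squared-amplitude
# probabilities. This only illustrates the mathematics, no quantum hardware.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                      # equal superposition of 0 and 1
probabilities = np.abs(state) ** 2    # [0.5, 0.5]; n qubits would need 2**n amplitudes

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probabilities)
print(probabilities)                  # [0.5 0.5]
print(np.bincount(samples))           # roughly 500 zeros and 500 ones
```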
As a demonstration, Figure 3 shows a very simple quantum circuit. A quantum circuit contains qubits for the quantum computation and classical bits for reading out its result. It usually starts with qubits in the zero state, which are put into superposition with the help of a special gate called the Hadamard gate. The calculation is realized by applying different quantum gates to the superposed qubits. The algorithm ends by measuring the qubits and storing the result in the classical bits.
Figure 3. A simple quantum circuit
Source: Wikipedia, https://en.wikipedia.org/wiki/Quantum_circuit
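A circuit of the kind described above can also be simulated classically at toy sizes. The sketch below (plain numpy again, with the gate matrices written out by hand; it is not necessarily the exact circuit of Figure 3) applies a Hadamard gate and then a CNOT gate to two qubits and samples the measurement; the perfectly correlated outcomes also illustrate the entanglement discussed earlier:

```python
# Classical simulation of a two-qubit circuit: Hadamard on the first qubit,
# then CNOT, then measurement. The resulting Bell state is entangled, so the
# sampled outcomes are perfectly correlated (only '00' and '11' appear).
import numpy as np

ket00 = np.zeros(4, dtype=complex); ket00[0] = 1              # |00>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = CNOT @ np.kron(H, I) @ ket00          # circuit: H on qubit 0, then CNOT
probabilities = np.abs(state) ** 2            # [0.5, 0, 0, 0.5]

rng = np.random.default_rng(1)
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probabilities)
print(dict(zip(*np.unique(outcomes, return_counts=True))))    # only '00' and '11'
```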
With the help of quantum circuits, quantum algorithms can be implemented and their possible impact analyzed, even if the necessary hardware is still only partially available. One possible application of quantum computers is to solve so-called computationally hard problems more efficiently; important examples are integer (prime) factorization and the elliptic curve discrete logarithm problem. Due to the massive parallelism of qubits, these problems can in theory be solved exponentially faster on a quantum computer than on classical ones. This causes serious concern among cryptography practitioners, as most currently used cryptographic protocols, such as those for secure and private data exchange, rely on one of these mathematical problems.
It is important to point out that quantum computers can follow computational models other than quantum circuits, and their most interesting application area is not necessarily breaking cryptographic protocols. Quantum computers are expected to excel at optimization and simulation problems rather than at running traditional algorithms faster than classical computers. One such direction is quantum annealing (QA), an optimization process for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states) [2]. Applications of such optimization range from financial market simulations, through designing better medicines, to weather forecasting and modeling climate change. From a blockchain perspective, however, quantum algorithms that crack cryptographic primitives play the most important role.
Realizing a quantum computer is currently far from trivial. It requires extreme physical conditions, such as temperatures near absolute zero. Another problem is that most current quantum computers are not very stable: on the one hand, the realized qubits have a high error probability and efficient quantum error correction is still challenging; on the other hand, most systems decohere rather quickly due to physical limits, so only small quantum programs can be run. Quantum computers are definitely not science fiction, however; there is already running quantum hardware with up to roughly 1,000 qubits, and there are quantum cloud platforms where everyday users can submit programs to quantum computers as jobs [3]. Although current machines are not yet big enough to solve real-life problems, the exponential development of the field means this can change radically within a decade.
Quantum computing and cryptography
Quantum computing may have many interesting applications, but from the perspective of blockchain applications and their challenges its relation to cryptography is the most important. As mentioned in the previous chapter, one threat to cryptography is the speed, or rather the parallelization capability, of quantum computers: they can solve specific hard mathematical problems much faster than classical computers. These hard mathematical problems are often the basis of cryptographic primitives, which creates a serious cybersecurity threat. From a practical point of view, two quantum algorithms are considered critical for any application containing cryptographic primitives:
- Shor’s algorithm is a quantum algorithm that finds the prime factors of an integer in polynomial time [4]. Variants of the algorithm can also solve the elliptic curve discrete logarithm problem. The problem is that some of the most important cryptographic protocols are based on exactly these problems. The most critical are RSA and ECC, which underpin public-private key cryptography, digital signatures and key exchange protocols heavily used in today's protocols such as TLS. These problems are currently exponential (or believed to be exponential) for classical computers, but with Shor’s algorithm they can be solved in polynomial time, rendering these cryptographic techniques practically unusable. Although current quantum computers are not advanced enough to break currently used systems, such as 2048-bit RSA, the quantum threat is considered real. Since most secure communication channels rely on the problems mentioned above, an attacker can record encrypted communication now and decrypt it in ten years, once strong enough quantum computers are available. This is the so-called harvest now, decrypt later attack. For mission-critical data, such as sensitive financial information that must remain private over a long timeframe, this poses a serious threat (a toy classical illustration of the reduction behind Shor’s algorithm is sketched after this list).
- Grover’s algorithm is a quantum algorithm for searching unstructured data. Instead of roughly N steps on a classical computer, the search with Grover’s algorithm takes on the order of the square root of N steps, a quadratic speedup [5]. Grover’s algorithm can be used to attack several well-known cryptographic primitives as well, such as symmetric encryption (e.g., AES) or cryptographic hash functions (e.g., SHA-256). Its impact is less critical than that of Shor’s algorithm, since the speedup is only quadratic rather than exponential. As a consequence, most of the affected cryptographic protocols and primitives can be made resistant to Grover’s algorithm simply by increasing the key size. Although there is no need to invent brand new cryptographic algorithms here, changing the key size of a symmetric cipher or hash function on a live blockchain system might still be challenging; it may require serious planning, preparation and testing.
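The quantum speedup of Shor’s algorithm comes from finding the period of modular exponentiation in polynomial time. The following toy sketch performs the same reduction entirely classically on tiny numbers; the brute-force period finding is exactly the step a quantum computer would accelerate:

```python
# Toy classical illustration of the reduction used by Shor's algorithm.
# The period of f(x) = a^x mod N is found by brute force here (exponential
# classically); a quantum computer finds it in polynomial time.
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r mod N == 1 (brute force, toy sizes only)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor(N, a):
    """Try to split N using the period of a modulo N."""
    if gcd(a, N) != 1:          # lucky case: a already shares a factor with N
        p = gcd(a, N)
        return p, N // p
    r = find_period(a, N)
    if r % 2 == 1:
        return None             # odd period: pick another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None             # trivial root: pick another a
    for d in (gcd(y - 1, N), gcd(y + 1, N)):
        if 1 < d < N:
            return d, N // d
    return None

print(factor(15, 7))   # (3, 5)
print(factor(21, 2))   # (7, 3)
```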
To protect cryptography against quantum attacks, two directions are being heavily investigated. On the one hand, post-quantum cryptography aims at new cryptographic schemes that run on classical computers but are resistant to quantum attacks, such as Shor’s or Grover’s algorithm. They are based on hard mathematical problems, different from factorization or elliptic curve discrete logarithms, that are believed to be hard to solve even for quantum computers. Quantum cryptography, on the other hand, consists of new cryptographic protocols that can only be implemented and executed with quantum hardware. Whether we speak of quantum or post-quantum protocols, inventing and standardizing a new cryptographic primitive is always a long-running and challenging process requiring contributions from the world's top cryptographers. To speed up standardization and the development of stable, usable post-quantum algorithms, NIST created a standardization challenge in which several post-quantum algorithms have competed over several rounds to become the standardized post-quantum cryptography of the future [6].
The NIST post-quantum cryptography effort started back in 2010 to help the world prepare for quantum attacks against cryptography. In 2016 a global competition was launched in which independent teams from around the globe could submit new quantum-resistant algorithms for evaluation. The purpose of the competition was to find and standardize the next generation of cryptographic algorithms that are resistant to both quantum and traditional attacks. The competition has run for several years and consists of multiple rounds of submission and evaluation. The proposed algorithms were evaluated against different criteria, such as performance, complexity, and quantum and classical vulnerabilities. From the 69 originally proposed candidates, 7 finalists and 8 alternative solutions were selected. The candidate algorithms are based on underlying mathematical problems that are believed to be hard to crack even with quantum computers. These mathematical problems are the following [7]:
- Lattice-based cryptography: built on the mathematical concept of a lattice, an algebraic structure from abstract algebra.
- Hash-based cryptography: builds digital signature algorithms on cryptographic hash functions, exploiting the fact that hash functions are more resistant to quantum attacks (see Grover’s algorithm).
- Multivariate cryptography: built on multivariate polynomials over a finite field.
- Code-based cryptography: based on error-correcting codes, exploiting the difficulty of decoding a general linear error-correcting code.
- Isogeny-based cryptography: based on walks in a supersingular isogeny graph.
At the end of the third round, a few algorithms remain in different categories as candidates for standardization; these are mostly based on lattice-based cryptography. Defining a cryptographic standard is a long-running activity: it takes years of research, community interaction and attack attempts until a protocol is regarded as secure. For this reason, a few alternative algorithms are considered in addition to the finalists. The most important finalist algorithms are the following:
- Key establishment and encryption: CRYSTALS-Kyber
- Digital signatures:
- CRYSTALS-Dilithium
- Falcon
- SPHINCS+
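SPHINCS+ in the list above is a hash-based scheme. To give an intuition of how a signature can be built from nothing but a hash function, the following toy Lamport one-time signature is a minimal sketch; it is not SPHINCS+ and omits everything a practical scheme needs (for example, Merkle trees to allow more than one signature per key):

```python
# Toy Lamport one-time signature: security rests only on the hash function,
# which is why hash-based schemes are considered relatively quantum-resistant.
# Illustrative sketch only, not SPHINCS+ or any standardized scheme.
import secrets, hashlib

def keygen():
    # 256 pairs of random secrets, one pair per bit of the message digest
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def sign(message, sk):
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    # reveal one secret of each pair, chosen by the corresponding digest bit
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(message, signature, pk):
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(hashlib.sha256(sig).digest() == pk[i][bit]
               for i, (sig, bit) in enumerate(zip(signature, bits)))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))   # True
print(verify(b"tampered message", sig, pk))     # False
```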
Besides post-quantum cryptography, another interesting initiative is quantum cryptography, which means cryptographic algorithms running on quantum computers or based directly on quantum physics. Although the field may seem somewhat speculative, since there are still no quantum computers efficient enough to run real quantum algorithms, there are already some real-life applications. These applications exploit specific quantum physical effects and are not necessarily based on full quantum computers with qubits, quantum gates or circuits.
One interesting and very useful field is random number generation. Using a cryptographically secure random number is critical, for example, during key generation in every system containing cryptographic components, including TLS keys or private keys for identity (such as the private keys of Bitcoin addresses). If a key is not generated with enough randomness, it can be guessed by an attacker and the stored money can be stolen. One such example is the brainwallet, in which a hash value of a secret passphrase was used as the private key; it turned out to be insecure simply because a passphrase chosen by a human does not have enough entropy. One novel approach is to use a device based on quantum effects to guarantee that the generated number is truly random (QRNG, quantum random number generation). Real randomness is guaranteed by quantum physics, and such devices are already available at a reasonable price.
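A minimal sketch of the difference between a passphrase-derived key and a key from a cryptographically secure source (Python standard library; the passphrase is illustrative only):

```python
# Contrast between a passphrase-derived "brainwallet" key and a key from a CSPRNG.
# A quantum random number generator plays the same role as the CSPRNG here,
# with randomness guaranteed by physics instead of by the operating system.
import hashlib, secrets

# Weak: the key space is only as large as the set of likely passphrases,
# so an attacker can simply enumerate phrases and hash them.
passphrase = "correct horse battery staple"
weak_key = hashlib.sha256(passphrase.encode()).digest()

# Strong: 256 bits drawn from a cryptographically secure random source.
strong_key = secrets.token_bytes(32)

print(weak_key.hex())
print(strong_key.hex())
```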
Another practical quantum application is so-called quantum key distribution (QKD), a key distribution protocol that lets two parties agree on a common secret. It is based on quantum mechanical properties of a laser beam that make it impossible for an eavesdropper to learn the common secret without being detected; the quantum physical mechanism behind this is the no-cloning theorem [8]. Although QKD protocols exist in real life, they are not yet practical, scalable or economically feasible at large scale. Nonetheless, there are existing initiatives and elaborated protocols, such as the BB84 protocol.
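As a rough, purely classical illustration of the sifting idea behind BB84 (no eavesdropper and no physical quantum channel; all sizes are illustrative):

```python
# Classical toy simulation of the BB84 sifting step: Alice sends random bits in
# random bases, Bob measures in random bases, and only positions where the bases
# match contribute to the shared key. Eavesdropping detection is omitted here.
import secrets

N = 32
alice_bits  = [secrets.randbelow(2) for _ in range(N)]
alice_bases = [secrets.randbelow(2) for _ in range(N)]   # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(N)]

# If Bob measures in the same basis as Alice he reads her bit; otherwise the result is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Publicly compare bases (not bits) and keep only the matching positions.
key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]

print(key_alice == key_bob)                              # True: the sifted keys agree
print(len(key_alice), "shared key bits out of", N, "transmissions")
```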
Quantum threats to blockchain systems
Considering blockchain-based applications and platforms, there are several areas where quantum risk can be a serious threat. This is becoming especially crucial because more and more blockchain applications are mission critical. Examples are:
- Payment: There are many blockchain-based payment applications, from cryptocurrencies through stablecoins to more regulated CBDC (Central Bank Digital Currency) use cases. These are regarded as highly mission-critical applications, and their security is essential even against a quantum adversary.
- Store of value: Some cryptocurrencies are used not for payment but rather as a store of value. In store-of-value use cases it is even more critical to have attack-resistant systems, because such systems are supposed to store value for 10, 20 or 30 years. Hence, a successful quantum attack, even if it affects only one account, might cause severe economic damage to the rest of the network as well.
- Tokens of financial institutions: There are innovative use cases for tokens issued by regulated financial institutions, ranging from deposit tokens to the tokenization of financial securities. In such use cases security, attack resistance and even quantum resistance are highly important, because a vulnerability might cause not only financial loss but also serious reputational damage to the issuing institution.
- Blockchain and identity: Identity use cases, such as self-sovereign identity or decentralized identity solutions, are usually combined with an identity blockchain to improve data authenticity and consistency. Most identity use cases are considered highly mission critical, and a successful quantum attack can cause serious damage.
To analyze the possible quantum threats to a blockchain platform or blockchain-based application, we propose a framework with the following systematic evaluation steps (Figure 4):
Figure 4. Quantum risk evaluation process
1. Threat model: Assessing attack models, possible hacks and applied cryptographic elements of the system.
2. Impact analysis: Determining the impact of a successful quantum attack.
3. Quantum readiness: Analyzing how far in the future the threat of this attack lies.
4. Prevention possibilities and risk mitigation: Evaluating whether an upgrade of the system, for example with post-quantum cryptography, would protect against a given category of threat.
5. Overall risk evaluation: Aggregating the information into a high-risk, medium-risk or low-risk rating for each point.
6. The assessment process: Establishing ongoing assessments covering technology management and organizational aspects, with regular evaluation periods.
In the following, we investigate the individual steps in more detail.
Threat model
Assessing the threat model of a blockchain system can be approached from several angles, with more or less formal methods and more or less systematic approaches. One systematic approach is the so-called STRIDE model; other initiatives try to bring formal verification methods into the field.
If a formal threat model analysis is not available, we can use a more pragmatic approach. From a practical point of view, assessing a threat model can consist of listing the possible attacks against the system. Possible attacks are usually categorized as follows:
- Attack against the consensus: like trying to manipulate consensus by having 51% of the voting resources.
- Attack against the network: like separating part of the network nodes by hacking the network.
- Attacking a node: like attacking a specific node, typically the leader in a Nakamoto consensus.
- Smart contract attacks: like exploiting smart contract implementation vulnerability.
- Attacking private keys: like stealing or forging private keys.
- Application level attacks (dApp-s, DeFi): like flash loan attacks against certain DeFi protocols.
Another way of assessing elements of a threat model for a blockchain system is to evaluate the cryptographic primitives in use, which we can call the cryptographic inventory (a minimal sketch of such an inventory follows the list below). Quantum computing primarily attacks the computationally hard problems behind these cryptographic elements, such as prime factorization or the elliptic curve discrete logarithm. Basic blockchain systems mostly contain hash functions and digital signatures; however, these core cryptographic elements are usually extended by further primitives in most recent DLT designs, such as:
- Digital signatures for authenticity or identity proofs
- Public - private key cryptography for asymmetric encryption, like for key exchange in secure communication protocols.
- Hash functions for data consistency or as a cryptographic primitive for other protocols, like mining in Nakamoto consensus.
- Random number generation: used in key generation in wallets.
- Secret sharing and multi-signatures, like Shamir’s secret sharing: used for example in wallets to split signing keys into several parts.
- Zero knowledge proofs, like Groth16: used for example in rollups or in realizing private blockchain transactions.
- Stealth addresses: used in Monero for increasing privacy features of the system.
- Ring signatures: used in many use-cases, like for increased privacy.
- Commitment schemes (e.g., hash commitment, Pedersen commitment): used for instance in private non-repudiability schemes.
- Further advanced cryptographic constructs, like functional or homomorphic encryption.
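A cryptographic inventory can be captured as simple structured data. The entries below are an illustrative sketch, loosely anticipating the Bitcoin example discussed next, not an authoritative list for any platform:

```python
# Illustrative cryptographic inventory entries; the fields and values are
# examples only, not an exhaustive or authoritative list for any platform.
crypto_inventory = [
    {
        "primitive": "ECDSA (secp256k1) digital signature",
        "used_for": "transaction authorization / identity proof",
        "quantum_threat": "Shor's algorithm (breaks elliptic curve discrete log)",
    },
    {
        "primitive": "SHA-256 hash function",
        "used_for": "ledger consistency, proof of work, address derivation",
        "quantum_threat": "Grover's algorithm (quadratic speedup only)",
    },
    {
        "primitive": "Random number generation",
        "used_for": "key generation in wallets",
        "quantum_threat": "indirect (weak entropy sources, not a quantum algorithm)",
    },
]

for entry in crypto_inventory:
    print(f"{entry['primitive']}: threatened by {entry['quantum_threat']}")
```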
Although assessing the cryptographic inventory might seem a good start, it can sometimes be misleading. The problem is that some cryptographic primitives may be used as building blocks in several complex protocols; as a consequence, the possible impact of a quantum vulnerability of a given building block should be analyzed carefully. Another point to consider is a possible quantum attack on a non-cryptographic building block: as an example, quantum machine learning might in the future be capable of efficiently guessing human-generated passwords. The cryptographic inventory should therefore be used with care and after detailed analysis. However, since most blockchain applications do not have an explicit threat model, we usually start with a list of the cryptographic building blocks in use.
As an example, consider the Bitcoin network. The system mostly uses classical cryptography: digital signatures based on elliptic curve cryptography provide a kind of identity proof for the holders of bitcoin, while cryptographic hash functions such as SHA-256 and optionally RIPEMD-160 have several application areas. The most important uses of hash functions in the Bitcoin protocol are:
- Hash functions are used for providing consistency in the ledger.
- Hash functions are used in the mining process. The preimage resistance of a cryptographically secure hash function practically guarantees that the necessary work was invested in the proof-of-work process. It also guarantees that the leader in the Nakamoto consensus is chosen with real randomness.
- Hash functions are used in address generation to provide an additional layer of quantum security. Bitcoin addresses are not simply the public keys but a double hash of the public key. The reason is that from a public key based on elliptic curve cryptography it is possible to derive the private key with Shor’s quantum algorithm; hashing the public key therefore provides additional quantum protection as long as the key is not revealed. A minimal sketch of this address hashing follows this list.
- Hash functions are also used in the address generation of HD (hierarchical deterministic) wallets to provide addresses that can be used only once, which gives extra privacy on the Bitcoin network.
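A minimal sketch of the hashing step that shields the public key (assuming Python's hashlib exposes RIPEMD-160 via OpenSSL; Base58Check encoding and real key generation are omitted, and the key bytes are a placeholder):

```python
# Sketch of Bitcoin's HASH160 step: the address commits to a hash of the public
# key, so the quantum-vulnerable public key itself stays hidden until the coins
# are spent. The "public key" bytes below are a placeholder, not a real key.
import hashlib

def hash160(public_key: bytes) -> bytes:
    sha = hashlib.sha256(public_key).digest()
    # ripemd160 is available when the underlying OpenSSL build provides it
    return hashlib.new("ripemd160", sha).digest()

fake_public_key = bytes.fromhex("02" + "11" * 32)   # placeholder compressed key
print(hash160(fake_public_key).hex())
```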
Taking Hyperledger Fabric as another example, we again find mostly asymmetric cryptography and cryptographic hash functions. Public-private key cryptography is used in digital signatures to provide identity for organizations and organizational roles; it is also used with hierarchical certificate authorities and X.509 certificates to provide a PKI-based role and identity system. In addition, communication between Fabric components is based on TLS, which uses PKI for the exchange of symmetric encryption keys. Cryptographic hash functions are the basis of several other components: they ensure ledger consistency and serve as a building block of the signing process in digital signatures. Besides classical cryptographic primitives, there are also initiatives around advanced cryptography; as an example, the Identity Mixer component of Hyperledger Fabric uses zero-knowledge proofs for advanced privacy.
Impact analysis
The next step of the quantum threat analysis is to assess the possible impact of a successful attack. We consider two dimensions of impact: the category of the impact and its severity. The category describes what kind of problem a quantum attack can cause; the severity evaluates how serious that problem can be.
Categorization considers the possible effect of an attack: it matters, for example, whether the attack targets system privacy, data integrity, or is simply an impersonation. The classical dimensions are confidentiality (information can be accessed only by authorized parties, e.g. secrets), integrity (data can be modified only by authorized parties) and availability (data and services remain accessible without outage). For our purposes, the most important categories are the following:
- Authenticity: Impersonating or forging fake identity, like spoofing
- Integrity/Consistency: Forging and hacking relevant information in the system, like tampering.
- Non-repudiability: Denying the initiation of a critical transaction.
- Privacy: Gaining access and visibility into protected data and information.
- Availability: Degrading the availability of the system, for example by denial-of-service attacks.
- Authorization: Elevating privileges, i.e. gaining access for users without proper rights.
There can be many methods for evaluating impact severity. One is to estimate the economic impact of a successful quantum attack. As an example, if a Bitcoin address is hacked, it causes a direct financial loss of up to the hacked amount; however, an even more significant economic loss may be caused indirectly by the loss of reputation as a stable store of value, which might significantly depress the Bitcoin price and imply an indirect loss for every holder of the cryptocurrency. A similar economic loss mechanism is possible in consortium networks realized with Hyperledger Fabric, although in such cases estimating the exact direct or indirect economic loss is even more complicated and depends heavily on the application that Fabric realizes.
As measuring the exact economic loss of an attack can be quite challenging, an alternative is to assess the impact of a threat on a qualitative scale, ranking how serious the possible attack might be. The simplest way is to evaluate on a three-level scale: low impact, medium impact or high impact (Figure 5). Examples of the different impact levels are:
Figure 5. Impact severity
- If the digital signature algorithm is cracked, fake identities can sign transactions, potentially making it possible to steal cryptocurrency or economic value. This is definitely a high-impact threat.
- If the hash function has problems at pseudonymous key generation, it might reduce the level of privacy on the network. That is certainly not nice, but perhaps not tragic; it might be considered a medium-impact hack.
- If the applied hash function leaks information about the preimage during mining, it can cause serious problems for systems using proof of work. For other consensus mechanisms, however, it is perhaps not so critical, so it might be considered a low-impact hack.
Considering the Bitcoin network, cracking the elliptic curve cryptography of the digital signature with a quantum attack can have different impacts. For a normal address it might not be so tragic, because most addresses are protected by an additional hash function, which is not easy to crack even with quantum computers (Grover’s algorithm); we can consider this a low-impact hack. Early addresses, however, such as the Nakamoto-era addresses, lack this extra hash protection and usually store relatively large value, so hacking such an address can surely be regarded as a high-impact hack.
Considering Hyperledger Fabric, the situation also depends on the exact use case. Compromising a certificate authority and issuing false but valid certificates can surely be identified as a high-impact hack. On the other hand, quantum vulnerabilities of the cryptographic hash function maintaining the consistency of the blockchain are less critical: due to the consortium, multi-company setup there may be a way to identify such attacks easily and resolve them in a coordinated way. Nevertheless, we would probably classify this as a medium-impact hack.
Quantum readiness
Another aspect of the quantum analysis is the question of quantum readiness: in other words, when will quantum computers be ready to exploit the vulnerabilities of the different cryptographic primitives in real life? There are many different estimates of when quantum computers will actually be capable of cracking currently used cryptographic primitives. Although the question seems simple, it is far from trivial to answer. There are two problems to consider:
- On the one hand, some quantum computers, such as D-Wave's machines, cannot run quantum gates or circuits; they are designed for optimization problems instead. As a consequence, no matter how many qubits they have, they will not be able to crack current cryptographic primitives.
- On the other hand, even quantum computers designed to run, for example, Shor’s algorithm are still rather unstable and error-prone, and quantum error correction is very much an area of active research.
Nevertheless, there are estimates that a few million physical qubits could already break currently used cryptosystems, such as 2048-bit RSA. The biggest quantum computers at the moment are said to be in the range of 1,000+ qubits, so we might argue that there is still time until the quantum threat becomes real. It is not wise, however, to underestimate the speed of an exponentially developing technology: if quantum technology really reaches its tipping point, sufficiently efficient quantum computers could arrive within years.
Instead of giving an exact estimate of when such quantum computers will be available, we use a similar qualitative scale as in the previous sections (Figure 6).
Figure 6. Quantum readiness
Due to the complexity of the field, we propose a simple three-level scale to evaluate how soon a quantum threat might become real, with the categories expected within 10 years, expected in 10 to 20 years, and expected beyond 20 years.
As an example, cracking currently used cryptographic elements based on prime factorization or elliptic curves with quantum computers can be expected within 10 years. The same is not necessarily true for attacking the currently used hash functions.
Prevention / Risk mitigation
The last important question is whether we have any idea how to prevent a possible quantum threat or how to mitigate the risk of a possible quantum attack. Generally, the following ideas can be considered (a small arithmetic sketch of the key-size argument follows this list):
- Increasing key size: Symmetric ciphers and cryptographic hash functions are believed to remain secure against quantum attacks with an increased key size, because Grover's algorithm offers only a quadratic speedup, i.e. roughly the square root of the classical brute-force effort. Unfortunately, the same is not true for digital signatures.
- Post-quantum cryptography: Classical algorithms that are secure against quantum attacks. The basis of such an algorithm is usually a computationally hard problem that cannot be solved efficiently even with quantum computers. The area is under heavy research and standardization is still in progress, but standardized quantum-resistant protocols can be expected in the near future.
- Quantum cryptography: Cryptography with quantum hardware. Working solutions exist, for instance, for key exchange and random number generation; these solutions, however, require special hardware and setup.
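A small arithmetic sketch of the key-size argument (rule-of-thumb security levels only, ignoring the large constant overheads of real quantum hardware):

```python
# Rule-of-thumb effect of Grover's algorithm on brute-force key search:
# a quadratic speedup roughly halves the effective security level in bits,
# so doubling the key size restores the original security margin.
def effective_bits(key_bits):
    classical = key_bits          # ~2^key_bits classical guesses
    quantum = key_bits // 2       # ~2^(key_bits/2) Grover iterations
    return classical, quantum

for key_bits in (128, 256):
    c, q = effective_bits(key_bits)
    print(f"{key_bits}-bit key: ~2^{c} classical vs ~2^{q} quantum search steps")
```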
Another blockchain-specific aspect to consider is the immutability of distributed ledger protocols. As data cannot be deleted from the ledger, it is questionable what can be done with all the cryptographic elements stored immutably on the ledger. Most typically, digital signatures are written into the ledger, and these are very strongly related to identity verification and to data and ledger consistency; modifying the cryptography behind such a signature can be quite challenging. In terms of blockchain, cryptographic primitives and upgrades, two categories can be distinguished:
- Ledger dependent upgrade: immutable data / signatures on the blockchain.
- Ledger independent upgrade: cryptographic primitives can be changed independently from the ledger.
Similarly to the previous sections, we can consider the prevention possibilities from a practical perspective, again using a three-level scale:
Figure 7. Quantum risk mitigation, threat prevention
- In the “easily mitigate” category, there is a detailed procedure for what to do if a certain quantum threat becomes realistic; the procedure is well elaborated and tested in detail.
- In the “somehow mitigate” category, we have an idea of what to do if a certain quantum threat becomes realistic, but the details are still unknown. As an example, TLS protocols will probably be migrated to quantum-resistant versions based on post-quantum cryptography; the details, however, are still unclear, as the whole area is under active research and standardization.
- In the “not known how to mitigate” category, it is still not clear what to do if the quantum threat appears to be close.
There are some further aspects to consider. On the one hand, upgrading cryptographic primitives in complex systems, even when standardized cryptographic protocols exist, can be quite challenging; it can require months or even years of preparation and testing. For this reason such processes should be prepared well in advance.
One design technique that can help with such upgrades is called cryptographic agility. It means that the system is built in a way that the cryptographic primitives in use are designed to be replaceable. It usually requires a modular system design in which cryptography is a dedicated module subject to upgrade and replacement. Although such a design sounds great in principle, it has difficulties: designing a modular system for cryptographic primitives is one thing, but designing the system so that modules can be replaced on the fly in a running system is much more challenging. It is even more challenging if we consider a blockchain system with cryptographic primitives written into the ledger.
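As a minimal sketch of what cryptographic agility can mean at the code level (the scheme names and the registry are illustrative and the signing logic is a stub, not real cryptography):

```python
# Illustrative sketch of cryptographic agility: callers depend on an abstract
# signature interface, so the concrete scheme can be swapped (e.g. from ECDSA
# to a post-quantum scheme) without touching application code. The "signing"
# below is a hash-based stub for illustration only.
from abc import ABC, abstractmethod
import hashlib

class SignatureScheme(ABC):
    @abstractmethod
    def sign(self, key: bytes, message: bytes) -> bytes: ...
    @abstractmethod
    def verify(self, key: bytes, message: bytes, signature: bytes) -> bool: ...

class StubEcdsa(SignatureScheme):
    def sign(self, key, message):
        return hashlib.sha256(b"ecdsa" + key + message).digest()
    def verify(self, key, message, signature):
        return signature == self.sign(key, message)

class StubDilithium(SignatureScheme):
    def sign(self, key, message):
        return hashlib.sha256(b"dilithium" + key + message).digest()
    def verify(self, key, message, signature):
        return signature == self.sign(key, message)

# The active scheme is a configuration choice, not hard-wired into the application.
SCHEMES = {"ecdsa": StubEcdsa(), "dilithium": StubDilithium()}
active = SCHEMES["ecdsa"]            # later switched to SCHEMES["dilithium"]

sig = active.sign(b"key", b"transaction payload")
print(active.verify(b"key", b"transaction payload", sig))   # True
```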
A general idea for post-quantum signature migration, for example, is to introduce new quantum-secure addresses in the system and to migrate all data and applications from the old addresses to the new ones. After such a migration, the old quantum-vulnerable addresses can be regarded as invalid. Although it sounds good in general, it is more difficult in practice. In Hyperledger Fabric, for example, the whole application must be designed so that such a migration to new addresses can be executed and makes sense. In the Bitcoin network, this migration would mean transferring bitcoin to new quantum-secure addresses; the problem, however, is that the owners of old addresses will not necessarily transfer, for several reasons. As an example, the initial Nakamoto addresses in Bitcoin have never been moved, and we may expect that they will not move even in a post-quantum migration.
Overall risk evaluation
Finally, the risk factors of the previous chapters must be aggregated into one risk level. We propose to use a qualitative three-level scale again: high risk, medium risk and low risk. It is certainly a question how the individual levels can be aggregated into an overall estimate. Generally, we propose the following rules (a minimal code sketch of such an aggregation rule is shown after Figure 8):
- If we have a high-impact threat that we expect to become feasible within 10 years and we do not know how to prevent it, that is certainly a high risk. RSA- or elliptic curve-based digital signatures can be regarded as being in this category: an efficient attack against factorization or the elliptic curve discrete logarithm will probably be available within 10 years, identity theft has a serious negative impact, and prevention or mitigation techniques are still under research.
- A high-impact threat that becomes realistic in 10 to 20 years, with at least an idea of how to prevent it, can be considered a medium risk. As an example, a quantum attack on the cryptographic hash function of the blockchain to break consistency falls into this category: an inconsistent ledger surely has a high negative impact, but attacking, for example, SHA-256 is not easy even with Grover's algorithm, so an efficient attack might take 10 to 20 years, and there are good mitigation techniques, such as using SHA-512 instead of SHA-256.
- Dealing with low impact threats with well known prevention possibilities can be evaluated as low quantum risk.
Figure 8. Overall risk evaluation
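A minimal sketch of such an aggregation rule (the mapping simply encodes the rules of thumb above; the exact policy is a judgment call and should be adapted to the platform under assessment):

```python
# Illustrative aggregation of the three assessment dimensions into an overall
# risk level, following the rules of thumb described above.
def overall_risk(impact, readiness, mitigation):
    """impact: low/medium/high, readiness: <10y / 10-20y / >20y,
    mitigation: easy / somehow / unknown."""
    if impact == "high" and readiness == "<10y" and mitigation == "unknown":
        return "high risk"
    if impact == "low" and mitigation == "easy":
        return "low risk"
    return "medium risk"

# ECC digital signatures: high impact, attack expected within 10 years, no clear fix.
print(overall_risk("high", "<10y", "unknown"))    # high risk
# Hash-based ledger consistency: high impact, 10-20 years away, mitigable with larger hashes.
print(overall_risk("high", "10-20y", "somehow"))  # medium risk
```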
The assessment process
It is important to point out that the assessment described above is not a one-off snapshot; it should be re-evaluated continuously. As a general evaluation process, we propose the following:
- First system analysis and evaluation: This might be a multi-day workshop with industry and systems experts in the following roles: system architects, security architects, cryptography experts, post-quantum cryptography experts and quantum computing experts. The result of the workshop is the first documented quantum risk analysis of the given blockchain-based system.
- Periodic re-evaluation (e.g., every year or two): during the re-evaluation, advancements in the following areas should be analyzed:
- Advancement in quantum computers, like the amount of available qubits
- New quantum algorithms available
- New results in attacking classical cryptography with quantum computing
- Advancement in post-quantum cryptography, like NIST standardization
- Usability (technical and economic) of quantum cryptography
Conclusion
In this article, we presented a high-level approach to assessing the quantum threats and quantum risks of a blockchain-based system. It is important to note that these fields are extremely complex: quantum computing, cryptography, post-quantum cryptography and blockchain systems are each complex fields on their own. Additionally, real-life DLT systems can be even more complicated, containing, for example, L2 solutions, blockchain interoperability or hybrid blockchain components.
Although real quantum attacks are still years away, it is never too early to start assessing the risks in this field. One reason is that migrating such complex systems to quantum-secure versions may take years of preparation, experimentation and testing, especially if we consider blockchain infrastructure and the mission-critical applications on top of it. Another reason is that even though sufficiently efficient quantum computers probably do not yet exist, there are already attack models that exploit data stored today at a later date (the harvest now, decrypt later attack). In this sense, a quantum threat analysis of our systems is highly relevant even now.
References
[1] Project Leap: quantum-proofing the financial system, BIS Innovation Hub, 2024, https://www.bis.org/about/bisih/topics/cyber_security/leap.htm
[2] Quantum annealing, Wikipedia, https://en.wikipedia.org/wiki/Quantum_annealing
[3] IBM Quantum Platform, Wikipedia, https://en.wikipedia.org/wiki/IBM_Quantum_Platform
[4] Shor's algorithm, Wikipedia, https://en.wikipedia.org/wiki/Shor%27s_algorithm
[5] Grover's algorithm, Wikipedia, https://en.wikipedia.org/wiki/Grover%27s_algorithm
[6] NIST, Post-Quantum Cryptography project, https://csrc.nist.gov/projects/post-quantum-cryptography
[7] Post-quantum cryptography, Wikipedia, https://en.wikipedia.org/wiki/Post-quantum_cryptography
[8] No-cloning theorem, Wikipedia, https://en.wikipedia.org/wiki/No-cloning_theorem