Security Vision
As we wrote in previous articles, information confidentiality is ensured through encryption, which in modern infrastructures is implemented as a hybrid scheme: asymmetric algorithms encrypt the transmitted symmetric key, which is then used for fast bulk encryption of data. Encrypting both the data and the encryption keys themselves is necessary because classical networks and storage devices face a fundamental problem: information transmitted as electrical signals, or stored in memory chips as electrical charges (logical 0 and 1 correspond to the minimum and maximum voltage levels), can be intercepted. Such interception can occur at the logical level, for example by copying files transmitted over the unprotected HTTP protocol, or at the physical level: by tapping the contacts (pins) of a memory chip, a twisted-pair or fiber-optic line, or by analyzing changes in the physical environment through spurious electromagnetic emissions and interference (SEMI). Crucially, reading (measuring) electromagnetic field strength is undetectable, since such observation does not alter the original physical carrier. By analogy, it is impossible to know whether someone has read the text of a book placed in a library reading room for public access. But what if the physical principle of information storage caused data to be irreversibly altered by any unauthorized attempt to read it, and computing devices operated not with just two values (0 and 1) but with many states at once, performing computational operations in parallel? This is not science fiction but the reality of quantum technologies, a field that is developing rapidly and promises revolutionary changes in the near future.
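The hybrid scheme described above can be illustrated with a deliberately simplified sketch: toy textbook RSA (tiny primes, no padding, never usable in practice) transports the symmetric key, and a toy XOR keystream stands in for a real symmetric algorithm such as AES. All parameters here are illustrative assumptions, not a real implementation:

```python
import hashlib

# Toy RSA key pair with tiny demo primes -- for illustration only.
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a SHA-256-derived keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Hybrid scheme: the symmetric key travels under RSA, the payload under the symmetric cipher.
session_key = 42                          # small so it fits under the toy RSA modulus
encrypted_key = pow(session_key, e, n)    # sender encrypts the key with the public key
ciphertext = keystream_xor(session_key.to_bytes(2, "big"), b"secret payload")

recovered_key = pow(encrypted_key, d, n)  # receiver recovers the key with the private key
plaintext = keystream_xor(recovered_key.to_bytes(2, "big"), ciphertext)
print(plaintext)  # b'secret payload'
```

In production the asymmetric step would use padded RSA or an elliptic-curve (or post-quantum) KEM, and the symmetric step an authenticated cipher such as AES-GCM; the sketch only shows the division of labor between the two.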
Quantum technologies are based on quantum physics, one of the most modern fields of science, whose discoveries were made by such outstanding scientists as Max Planck, Max Born, Werner Heisenberg, and Paul Dirac, as well as Albert Einstein and Niels Bohr (their scientific debate and the resulting Einstein-Podolsky-Rosen paradox are considered a classic example of the development of the philosophy of physics). One of the practical results of this work was the emergence of quantum information science as a new field, and its practical embodiment was the advent of quantum computers. While classical information science uses bits that can take only two values (0 and 1), quantum information science uses qubits (short for "quantum bits"), which can also exist in a superposition (combination) of their states. The values of qubits (0 or 1) are determined by the angular momentum of physical objects of the quantum world: photons, electrons in superconductors and semiconductor materials (for example, gallium arsenide or a high-purity silicon isotope), and ions of various chemical elements (for example, strontium or ytterbium). The angular momentum of an elementary particle is called spin and can be pictured as the rotation of the electromagnetic field around the particle in a specific direction: for example, the spin of a photon (a particle of light) can point along or against the direction of the particle's motion, corresponding to a qubit value of 0 or 1. A quantum register of n qubits can exist in a superposition of 2^n states: for example, while a classical 3-bit register stores exactly one value (say, 101), a quantum register of three qubits spans all 8 (2^3) values (000, 001, 010, 011, 100, 101, 110, 111), and a 32-qubit quantum register can represent up to 2^32 values simultaneously.
Thus, compared to a classical system, a quantum system of qubits can process far more information, the volume of which grows as an exponential function with base 2 (i.e., 2^n, where n is the number of qubits in the system). In addition, quantum computers obey the principle of quantum parallelism, in which calculations are performed simultaneously on all data stored in quantum registers.
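The 2^n growth of the state space is easy to see by enumerating the basis states an n-qubit register spans; a minimal sketch (classical enumeration only, not a quantum simulation):

```python
from itertools import product

def basis_states(n: int) -> list[str]:
    """All 2**n classical basis states that an n-qubit register spans."""
    return ["".join(bits) for bits in product("01", repeat=n)]

states = basis_states(3)
print(len(states))  # 8
print(states)       # ['000', '001', '010', '011', '100', '101', '110', '111']
print(2 ** 32)      # 4294967296 states for a 32-qubit register
```

A classical register of the same width holds exactly one of these strings at a time; the quantum register can hold an amplitude for every one of them simultaneously.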
Two other important quantum phenomena used in quantum information science are quantum entanglement (which can also be translated as mutual connectivity of states) and quantum tunneling (the tunnel effect). Quantum entanglement refers to the interconnectedness of the quantum states of all particles in a quantum system, even across very large distances: for example, in a system of two entangled photons, the spin of one will be opposite to the spin of the other at the moment of measurement. Furthermore, any measurement of a particle's properties leads to an instantaneous change of the system's state (called "wave function collapse" or "von Neumann reduction"), and the impossibility of copying a quantum state is enshrined in the no-cloning theorem: an attempt to "read" the state of the system changes it irreversibly and therefore cannot produce a copy. Quantum tunneling refers to a particle's ability to overcome an electric potential barrier even when the particle's energy is insufficient: for example, quantum computing with superconducting qubits exploits the Josephson effect, in which an electric current passes through a dielectric layer separating two superconductors (their junction is called a Josephson junction).
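The measurement behavior described above can be mimicked with a small classical sketch (the `EntangledPair` class is purely illustrative; classical code cannot reproduce genuine entanglement, only its anti-correlated outcomes and the "collapse on first measurement" rule):

```python
import random

class EntangledPair:
    """Classical sketch of a two-particle singlet state: the outcome is
    undetermined until the first measurement, then perfectly anti-correlated."""
    def __init__(self):
        self._collapsed = None  # no definite value exists before measurement

    def measure(self, particle: int) -> int:
        if self._collapsed is None:
            # The first measurement "collapses" the shared state (von Neumann reduction).
            self._collapsed = random.choice([0, 1])
        # Particle 0 returns the collapsed value; particle 1 returns the opposite spin.
        return self._collapsed if particle == 0 else 1 - self._collapsed

pair = EntangledPair()
a = pair.measure(0)
b = pair.measure(1)
print(a + b)                  # always 1: the two spins are opposite
print(pair.measure(0) == a)   # True: re-measuring a collapsed state gives the same result
```

The no-cloning theorem has no classical analogue here, but the sketch shows why an interceptor cannot "peek" unnoticed: the very first read fixes the state for everyone.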
Active scientific research in quantum physics continues: for example, the existence of a new phase of matter (a topological semimetal) was recently confirmed experimentally, which should allow the creation of more reliable quantum computers, and Russian scientists recently created a new type of qubit based on gallium arsenide that allows two qubit parameters (charge and spin) to be controlled simultaneously, paving the way for more stable quantum computers. There are various physical implementations of quantum processors (QPU, Quantum Processing Unit), but the most widespread today is quantum computing based on superconductivity. It builds on advances in semiconductor technology and existing equipment, but achieving superconductivity requires cooling devices to near absolute zero (0 kelvin, minus 273.15 degrees Celsius), providing complex electronic switching within the devices, and combating the negative effect of decoherence: the change in the state of a quantum system caused by an imperfect environment and defects in the materials used. Nevertheless, major companies are already producing quantum processors: Google's Willow QPU contains 105 qubits and is designed for error correction during operation, while IBM's Nighthawk quantum processor operates with 120 qubits. A prototype 70-qubit quantum computer was recently presented in Russia, and a quantum system with seven levels of quantum states was also implemented there.
From a cybersecurity perspective, quantum technologies are of interest in the following applications:
1. Vulnerability of classical cryptographic algorithms to quantum computing.
Many traditional cryptographic schemes (RSA, the Diffie-Hellman scheme for generating a shared secret key) rely on the mathematical difficulty of classically solving the factorization problem (decomposing a number into prime factors) and the discrete logarithm problem (finding the exponent x in the equation g^x = a (mod p)). However, in 1994 the American scientist Peter Shor, then working at AT&T Bell Labs, developed a quantum algorithm (later named after him) that performs factorization and discrete logarithms at a speed close to that of the direct operations (i.e., computing a product or raising to a power); this is possible because quantum computers perform calculations in parallel and over much larger volumes of information. In 2001, researchers assessed the computing power required to apply Shor's algorithm against the traditional cryptosystems of the time and the danger it posed, and in the same year a group of researchers at IBM implemented Shor's algorithm in its simplest form on a 7-qubit quantum processor. In 2023, Shor's algorithm was improved; the new scheme was named Regev's algorithm in honor of its creator.
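The asymmetry that Shor's algorithm removes can be felt even at toy scale: the direct operation (modular exponentiation) is fast via square-and-multiply, while the classical discrete logarithm must search the exponent space. The modulus 2^31 − 1 and base 7 below are illustrative choices, not real cryptographic parameters:

```python
p = 2_147_483_647   # the Mersenne prime 2**31 - 1 (illustrative modulus)
g = 7               # a primitive root modulo p
x = 1_234_567       # the "secret" exponent
a = pow(g, x, p)    # direct problem: instant, via square-and-multiply

def brute_force_dlog(g: int, a: int, p: int, limit: int):
    """Classical discrete logarithm by exhaustive search --
    cost grows exponentially with the bit length of the exponent."""
    value = 1
    for candidate in range(limit):
        if value == a:
            return candidate
        value = value * g % p
    return None

recovered = brute_force_dlog(g, a, p, 2_000_000)
print(recovered)  # 1234567 -- found only by stepping through every candidate
```

At real key sizes (2048-bit moduli) the search space makes this classically infeasible, while a sufficiently large quantum computer running Shor's algorithm would recover x in polynomial time.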
These studies and their practical implementation, despite their academic interest, raise justified concerns in the information security community: the prevalence of classical cryptosystems and the emergence of commercial quantum computers may threaten conventional encryption algorithms (primarily asymmetric ones) and digital signatures, as well as protocols for generating a shared secret key over an open communication channel (KEM, Key Encapsulation Mechanism, a development of the Diffie-Hellman scheme). In addition, the "harvest now, decrypt later" approach suggests that attackers are already accumulating encrypted data and will attempt to decrypt it once quantum computers become available. Thus, cryptographic schemes for encryption, digital signatures, and KEM mechanisms that were considered secure when deployed could be easily broken in the foreseeable future. With this in mind, in 2016 the American institute NIST appealed to the cryptographic community to create new cryptographic algorithms and launched a program to select cryptographic algorithms resistant to attacks by quantum computers (as well as by classical, non-quantum devices), i.e., to create quantum-resistant or post-quantum cryptography (PQC, Post-Quantum Cryptography). As a result of this work, in 2024 NIST released three FIPS standards for post-quantum cryptography:
· FIPS 203 "Module-Lattice-Based Key-Encapsulation Mechanism Standard";
· FIPS 204 "Module-Lattice-Based Digital Signature Standard";
· FIPS 205 "Stateless Hash-Based Digital Signature Standard".
Migration to post-quantum cryptography standards is already underway, and by 2035, NIST will remove quantum-vulnerable cryptographic algorithms from its list of standards.
2. Using quantum cryptography for quantum key distribution.
The principles of quantum mechanics, including quantum entanglement, von Neumann reduction, and the no-cloning theorem, make it possible to solve the problem of securely distributing symmetric encryption keys, which in classical infrastructures is handled by asymmetric algorithms. Quantum Key Distribution (QKD) allows symmetric encryption keys to be transmitted over open communication channels using one of the specialized protocols (the BB84 and B92 protocols developed by Charles Bennett, and the E91 protocol developed by Artur Ekert). However, quantum key distribution has its drawbacks: the sender and receiver still need an alternative channel to confirm the key transfer, the key recipient must verify its authenticity, the quantum system and the transmission channel may be vulnerable to denial-of-service attacks because of their sensitivity, and the transmission channel itself (e.g., a fiber-optic line) must be dedicated. Furthermore, a complete quantum key distribution system is quite expensive and requires special operating and maintenance conditions.
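A heavily simplified classical model of BB84 shows why eavesdropping is detectable: an intercept-resend attacker who guesses the measurement basis wrong disturbs the photon, leaving roughly 25% errors in the sifted key. The function below is an illustrative sketch, not a faithful physical simulation:

```python
import random

def bb84_sift(n_bits: int, eavesdrop: bool = False):
    """Toy BB84 round: random bits and bases, basis sifting, error count."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bases   = [random.choice("+x") for _ in range(n_bits)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            # Intercept-resend attack: Eve measures in a random basis,
            # collapsing the photon state before it reaches Bob.
            eve_basis = random.choice("+x")
            if eve_basis != a_basis:
                bit = random.randint(0, 1)  # wrong basis randomizes the bit
            a_basis = eve_basis             # photon is resent in Eve's basis
        # Bob's measurement: the correct basis preserves the bit, a wrong one randomizes it.
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: keep only the positions where Alice's and Bob's bases matched.
    kept = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return len(kept), errors

kept, errors = bb84_sift(4000)
print(errors)        # 0: without an eavesdropper the sifted keys agree
kept_e, errors_e = bb84_sift(4000, eavesdrop=True)
print(errors_e > 0)  # True: Eve leaves roughly 25% errors in the sifted key
```

In the real protocol, Alice and Bob publicly compare a random sample of sifted bits over the authenticated classical channel; an error rate far above the channel noise floor reveals the interception.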
3. Construction of quantum networks.
The fundamental principles of quantum mechanics make it possible to build data transmission networks with an unprecedented level of security. As with the transmission of encryption keys, the data itself can be securely transmitted using fiber optics or outer space (vacuum) as the channel. Such projects already exist: China launched a commercial quantum network in 2017 and, in 2021, brought online a quantum network with a total length of 4,600 km; in Russia, construction of a quantum network between Moscow and St. Petersburg was completed in 2021. However, difficulties associated with the negative effect of decoherence due to imperfect transmission channels, as well as the high cost of equipment, are currently holding back development in this area.
4. Generation of truly random numbers.
Traditional computers generate pseudorandom numbers using a seed value and the properties of their hardware components. Quantum computers, by contrast, can generate truly random numbers thanks to the inherent uncertainty of quantum processes, and developers can use ready-made templates and libraries for working with quantum devices.
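The practical difference is easy to demonstrate: a seeded pseudorandom generator replays its sequence exactly, while an entropy source does not. (Python's `os.urandom` draws from the operating system's entropy pool; a quantum RNG would go further, deriving its bits from fundamentally nondeterministic measurements.)

```python
import os
import random

# A pseudorandom generator is fully determined by its seed:
# replaying the same seed replays the exact same "random" sequence.
random.seed(1234)
first_run = [random.randint(0, 255) for _ in range(8)]
random.seed(1234)
second_run = [random.randint(0, 255) for _ in range(8)]
print(first_run == second_run)  # True -- fully predictable if the seed leaks

# An entropy source has no replayable seed: successive reads differ
# (equality would be a 2**-64 coincidence).
print(os.urandom(8) == os.urandom(8))
```

This predictability is exactly why leaked or guessable seeds have broken real-world key generation, and why hardware (and, prospectively, quantum) entropy sources matter for cryptography.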