Monday, August 12, 2019

Artificial intelligence, quantum computing and the laws of encryption


The last decade has seen several science and technology breakthroughs. From self-driving cars to 3D printing, and from clean energy technologies to artificial intelligence assistants, progress has been swift. While some technologies take decades to become useful, others disrupt quickly. In 2019, two major technologies have been making headlines yet are still not being taken very seriously – #artificial_intelligence (AI) and quantum computing (QC). These technologies will change the nature of cyber-attacks. Artificial intelligence can be used not only to probe targets but also to tailor attacks against specific organizations. We have already seen #AI used to copy a person's voice and mannerisms to create "deep fakes": fabricated content that looks and sounds as though the real person said it.
Quantum computing research took off in the early 1990s, and the field is now emerging as the next generation of computing. Operations that take hours or days today will happen in seconds with quantum power. With that #technology, the scale of feasible computation rises dramatically, to the point where the time needed to break traditional encryption would shrink to weeks, or maybe even minutes. This means breaking some of the foundational encryption in use today. Estimates for when QC will really take off range anywhere from 5 to 20 years. One thing we do know, however, is that QC has the potential to completely transform the cyber threat landscape.
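To make that scaling concrete, the toy calculation below compares an exhaustive key search run classically against a Grover-style quantum search, which roughly halves the effective key length. The trial rates are arbitrary assumptions chosen only to illustrate how the gap behaves as key sizes grow; they are not real hardware figures.

```python
# Rough, illustrative comparison of classical brute force (O(2^n) trials)
# versus Grover-style quantum search (O(2^(n/2)) trials) for a symmetric key.
# The trial rates below are arbitrary assumptions, not real hardware figures.

CLASSICAL_TRIALS_PER_SEC = 1e12   # assumed classical key-guess rate
QUANTUM_TRIALS_PER_SEC = 1e6      # assumed (much slower) quantum operation rate

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int, trials_per_sec: float, quantum: bool) -> float:
    """Return the rough number of years to exhaust the key space."""
    effective_bits = key_bits / 2 if quantum else key_bits
    return (2 ** effective_bits) / trials_per_sec / SECONDS_PER_YEAR

for bits in (80, 128, 256):
    classical = years_to_search(bits, CLASSICAL_TRIALS_PER_SEC, quantum=False)
    grover = years_to_search(bits, QUANTUM_TRIALS_PER_SEC, quantum=True)
    print(f"{bits}-bit key: classical ~{classical:.2e} years, Grover ~{grover:.2e} years")
```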



That said, quantum computing poses risks to some cryptographic algorithms. For instance, public-key algorithms based on the discrete logarithm problem, the elliptic curve discrete logarithm problem, and the integer factorization problem (the basis of RSA encryption) are vulnerable to Shor's algorithm, which solves these problems efficiently on a sufficiently large quantum computer. Whoever develops such quantum computers first would be able to break legacy encryption protecting historical information. Parallel to the development of quantum computing has been that of "post-quantum" or "quantum-resistant" cryptography, which aims to create encryption mechanisms that withstand quantum decryption capabilities. It remains to be seen whether these will achieve widespread adoption before quantum computers are able to trivialize existing encryption schemes.
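As a concrete illustration of why efficient factoring matters, here is a toy RSA example with deliberately tiny primes. Shor's algorithm does not work like the classical trial-division loop shown here; the point is only that anyone who can factor the public modulus, by whatever means, can reconstruct the private key. All values are illustrative, and the snippet assumes Python 3.8+ for the modular inverse via pow().

```python
# Toy RSA example with tiny primes, showing why efficient factoring (which
# Shor's algorithm provides on a large quantum computer) breaks RSA.
# Real keys use primes of ~1024+ bits; these values are purely illustrative.

p, q = 61, 53                # secret primes (an attacker must recover these)
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient, computable only via p and q
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)          # encryption with the public key
recovered = pow(ciphertext, d, n)        # decryption with the private key
assert recovered == message

# An attacker who can factor n (trivial here, infeasible classically at scale)
# can reconstruct d and decrypt anything encrypted under the public key.
for candidate in range(2, n):
    if n % candidate == 0:
        p_found, q_found = candidate, n // candidate
        break
d_attacker = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(ciphertext, d_attacker, n) == message
```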
While threat actors may use quantum computing to defeat some encryption algorithms, we expect that the adoption of quantum key distribution (QKD) will strengthen the secrecy of communication networks. The nature of quantum key generation and distribution protects communications because any observation of the quantum states used to exchange a key necessarily disturbs them in a detectable fashion. As a result, we predict this will severely inhibit traffic interception schemes, as the communicating parties would be able to detect when a key exchange has been observed in transit and discard the compromised key.
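The sketch below is a heavily simplified, classical simulation of the BB84 intercept-and-resend scenario, included only to illustrate the detection idea described above: when an eavesdropper measures each qubit in a randomly chosen basis and resends it, roughly a quarter of the compared bits disagree, which the legitimate parties can spot by sacrificing part of their sifted key. The parameters and structure are illustrative assumptions, not a real QKD implementation.

```python
# Toy BB84-style sketch: an eavesdropper who measures qubits in random bases
# disturbs them, so Alice and Bob see errors when they compare a sample of
# their sifted key. This is a classical simulation of the idea, not real QKD.
import random

def measure(bit, prep_basis, meas_basis):
    """If bases match, the bit is read faithfully; otherwise the outcome is random."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def run(n_qubits=2000, eavesdrop=False):
    errors = compared = 0
    for _ in range(n_qubits):
        bit = random.randint(0, 1)
        a_basis = random.choice("XZ")          # Alice's preparation basis
        if eavesdrop:                          # Eve measures and resends
            e_basis = random.choice("XZ")
            bit_in_flight = measure(bit, a_basis, e_basis)
            flight_basis = e_basis
        else:
            bit_in_flight, flight_basis = bit, a_basis
        b_basis = random.choice("XZ")          # Bob's measurement basis
        b_bit = measure(bit_in_flight, flight_basis, b_basis)
        if a_basis == b_basis:                 # keep only matching-basis rounds
            compared += 1
            errors += (b_bit != bit)
    return errors / compared

print("error rate without Eve:", run(eavesdrop=False))   # ~0%
print("error rate with Eve:   ", run(eavesdrop=True))    # ~25%
```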
As of today, quantum computers exist, and developers can access them through the cloud. However, current quantum computers have some limitations, including the instability of quantum computing environments, which makes their practical use more difficult. Researchers are currently working to mitigate these limitations. It should be noted that quantum computing is still primarily in the research and development phase; large-scale application production and rollout has not occurred yet. Companies and countries are spending millions of dollars to win the race to get there first. U.S. quantum computing development has achieved good performance in terms of the raw number of qubits (a 72-qubit processor); however, China currently holds the record for experimentally demonstrating 18-qubit entanglement, and entanglement is the basis of both quantum computation and quantum communication. China may be behind in raw quantum computing hardware, but it is making good headway on finding applications for quantum computing once it becomes a reality. While quantum computing is still years away from becoming a conventional technology, it is a tight arms race.
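For readers unfamiliar with entanglement, the brief NumPy sketch below builds the simplest entangled state, a two-qubit Bell pair, from standard textbook gate matrices. It is a state-vector toy, not a model of the 18-qubit experiment mentioned above; the gates and structure are generic constructions rather than anything specific to that work.

```python
# Minimal state-vector sketch of two-qubit entanglement (a Bell pair), the
# resource behind the multi-qubit entanglement records mentioned above.
# Uses only NumPy; real experiments involve far more machinery.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT: (|00> + |11>) / sqrt(2)
state = np.kron(ket0, ket0)
state = np.kron(H, np.eye(2)) @ state
bell = CNOT @ state

probs = np.abs(bell) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# Only "00" and "11" appear, each with probability 0.5: the two qubits'
# measurement outcomes are perfectly correlated.
```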
