Quantum computing is among the more peculiar objects of occasional media coverage. It pops up once in a while in a news report or magazine article, it rings a bell in the heads of many, it is generously commented on by wannabe scientists, and it raises mixed feelings of great progress and looming threat; but it is hardly understood by anybody. What lies behind the mysterious façade? Will quantum computing overthrow the world as we know it? Will it make encryption impossible, as some purport? Or is it yet another marketing hype?
A few years ago, we saw an enormous, yet short-lived, hype develop around another promising technology: the blockchain. Unforgotten is the iced-tea producer that rebranded itself as Long Blockchain Corp. and was rewarded with an overnight tripling of its stock price. Not long after, the blockchain bubble burst. Currently, a Japanese company is offering to "activate" ordinary water into "quantum water" with "numerous economic and health benefits"; businesses ranging from architects to agricultural producers to cloud storage providers proudly carry "quantum" in their names. Are we facing an inevitable "quantum bubble"?
Before divesting our stocks in panic, we should note some fundamental differences between quantum computing and the blockchain. First and foremost, the blockchain is a relatively new concept. The first public blockchain system, Bitcoin, was eight years old when Long Island Iced Tea Corp. undertook its rebranding and nine years old when the Bitcoin price tumbled. Second, the blockchain is a software concept with interesting applications, but not a field of fundamental research and hardware development.
In contrast, "quantum" is a century old – older than the computer. The theoretical basis for quantum technology was laid by physicists in the early 20th century. Quantum physics is a highly abstract theory that explains the behaviour of the very smallest elements of nature, objects the size of an atom and smaller. Their behaviour can be so counter-intuitive that Albert Einstein himself had difficulty accepting it ("He [God] does not throw dice."). In fact, to this day the theory has not been reconciled with the – equally counter-intuitive – behaviour of the universe at large scales.
Quantum technology is the branch of engineering that tries to harness the peculiar characteristics of the quantum world for specific technological applications, one of which is computing. However, engineering at such small scales is not an easy business. The slow pace of advancement in the field is a good indicator of the problems that engineers and physicists are facing. Quantum computers are massive machines that have to be placed in shock-absorbing cabins and cooled to temperatures colder than outer space. Furthermore, scientific development seems to be divided along the front lines of economic rivalries. In 2019, Google claimed that its quantum computer Sycamore had performed in 200 seconds a computation that would take a classical supercomputer 10,000 years – roughly 1.5 billion times faster; a claim immediately disputed by IBM. To date, no quantum computer has been built that has indisputably solved a problem faster than classical high-performance computers.
Therefore, quantum computing is currently a mostly theoretical undertaking. Programmes and algorithms have already been developed, despite the lack of a machine to execute them. Our knowledge of the potential of quantum computing is consequently derived from theoretical considerations rather than real experience. However, the development of classical computing was somewhat similar: the concept of the Turing machine, which underlies every modern computer programme, was developed before the first fully capable computer was built. What is it, then, that makes quantum computing so different from classical computing?
Classical computers use electrical currents to represent bits, pieces of binary information that are either 0 ("off") or 1 ("on"). Quantum computers instead work with qubits, which are also binary but are represented by some very small particle with an observable property interpreted as either 0 or 1. In the quantum world, however, the properties of a particle are often not precisely determined. Instead, the particle may be in a state somewhere between 0 and 1. As long as the particle's state has not been measured, it is considered to be in both states at the same time (just like Schrödinger's infamous cat is both alive and dead as long as nobody opens the box). Only at the end of a computation is the state of all qubits definitely measured. Then each takes either the value 0 or 1, but during runtime, quantum algorithms can cleverly exploit the whole continuous spectrum between 0 and 1. This extension of the on-off binarity of classical computers is one reason why quantum computers can be significantly faster than classical computers on some tasks.
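To make the idea of a state "between 0 and 1" a little more concrete, here is a minimal sketch in plain Python – no quantum hardware or library involved, and the amplitudes are made-up illustration values (real qubits use complex amplitudes; real numbers suffice for the point). A qubit is described by two numbers whose squares give the probabilities of measuring 0 or 1; measurement collapses the in-between state to a definite bit:

```python
import random

# A single qubit, simulated classically as two amplitudes.
# The squared amplitudes must sum to 1 and give the probabilities
# of the two measurement outcomes.
alpha, beta = 0.6, 0.8        # made-up values: 36% chance of 0, 64% of 1
assert abs(alpha**2 + beta**2 - 1.0) < 1e-9

def measure():
    """Collapse the in-between state to a definite bit,
    chosen at random with the squared-amplitude weights."""
    return 0 if random.random() < alpha**2 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)                 # roughly {0: 3600, 1: 6400}
```

A classical simulation like this cannot, of course, reproduce what makes qubits powerful – interference between amplitudes and entanglement between many qubits – it merely illustrates the measurement step.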
The attentive reader might have noticed that even in the world of classical computing, speed can vary immensely. IBM's latest supercomputer is certainly faster than a crappy old Windows XP machine. What does it mean, then, that quantum computers are generally "faster" than classical computers? In computer science, speed refers to the complexity of the algorithms that can be run on a machine: how much additional time a calculation requires as the problem size grows. Imagine you had two different methods to do a calculation with some number n. Method A takes twice as much time every time you increase n by 1. Method B steadily takes one hour more for every increase of n by 1. Then, for larger and larger n, even the crappiest Windows XP machine using method B will eventually overtake IBM's supercomputer running method A – at least in theory. This property of algorithms (read: methods) is independent of the machines you run them on. Quantum computers are considered faster than classical computers because it is assumed that for some problems there are "method B" algorithms that can only be run on quantum computers, while all classical computers are stuck with a "method A".
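The crossover is easy to check with a quick back-of-the-envelope script. The speeds below are entirely made up – a million steps per second for the fast machine, a fixed hour per step for the slow one – but the conclusion holds for any constants you pick:

```python
# Made-up running times for the two methods described above.
def seconds_a(n):
    # Method A on the fast supercomputer: time doubles whenever n
    # grows by 1 (assume a million doubling-steps per second).
    return 2 ** n / 1_000_000

def seconds_b(n):
    # Method B on the slow machine: one extra hour per increase of n.
    return n * 3600

n = 1
while seconds_b(n) >= seconds_a(n):
    n += 1
print(n)   # 37: from this problem size on, the slow machine wins
```

With these numbers the slow machine pulls ahead at n = 37; choosing other constants merely shifts the crossover point, it never removes it.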
Note the cautious wording: quantum algorithms are only assumed to be faster than classical ones because, in fact, we do not know for certain. Quantum algorithms have been studied for decades and are, unlike the physical machines to execute them, well developed. For many problems, a quantum "method B" has been found while for classical computers only a "method A" is known. In many cases, however, it has not been proven that no fast classical algorithm exists. There might be a classical "method B" out there that just hasn't been found yet. Whether fast algorithms exist for a large class of such problems, or provably do not, is closely related to the famous P versus NP problem, one of the most prominent open questions of modern mathematics.
It does not seem likely that generations of researchers have simply overlooked efficient algorithms for so many problems, but it has happened before – namely, for the recommendation problem. Services like Amazon try to predict what users might like from their previous purchases and their commonalities with other users. Calculating the next best recommendation is a mathematical problem that was, for quite some time, believed not to be efficiently solvable on classical computers; only an efficient algorithm for quantum computers was known. But in 2018, Ewin Tang, a teenage undergraduate student from Texas, startled the research community with a "method B" for classical computers to solve the recommendation problem. She thus disqualified one of the more famous examples of tasks assigned to the quantum world.
Nevertheless, other tasks remain practically unachievable with classical computer technology, most prominently cryptanalysis. Some widely used modern cyphers can be broken by determining the prime factors of a very large number that forms the key. The task is simple in theory, but for keys of practical size the best known classical algorithms would take thousands of years; for quantum computers, a fast factoring algorithm – Shor's algorithm – has been known since 1994. Therefore, as soon as someone builds a sufficiently powerful quantum computer, current encryption systems become useless. Quantum-proof encryption is under development, but encrypted information that has been intercepted or leaked earlier will still be compromised. Luckily, as of today, no number larger than 15 has been factorised with a quantum computer – something that even a crappy Windows XP machine can do in an instant.
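For perspective, here is what the current quantum record amounts to on a classical machine – a naive trial-division factoriser in Python. This is a toy sketch: real attacks on real keys use far more sophisticated methods, and no toy like this threatens actual encryption.

```python
def prime_factors(n):
    """Factor n by trial division: hopeless for the 600-digit
    numbers used in real keys, instantaneous for 15."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # whatever remains is itself prime
    return factors

print(prime_factors(15))    # [3, 5] -- matching the quantum record
```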