The end of classical computing limits
8 May 2018
Most of us assume that, even with the help of modern computers, we will never be able to solve many of the problems taxing humanity, from degenerative diseases to sustainable energy. The classical binary architecture (1s and 0s) of the computers we use today, no matter how large we build them, puts those problems out of reach. To solve problems such as protein folding (e.g. to find cures for Alzheimer’s) and superconducting efficiency (e.g. to improve the storage and distribution of energy), we must emulate the way Nature itself performs those calculations. This is essentially what quantum computers do. They are not “faster” computers; they are a whole new paradigm that moves us on from binary processing to quantum mechanics. In doing so, these computers open up possibilities that extend even to the full simulation of the human brain, which is also quantum mechanical.
Given that we already have autonomous cars on the roads, it’s easy to think that, given enough processing power, any problem known to mankind can be solved. However, there are limits to the current “classical” computing stack. One example is our inability to accurately simulate Nature. In the early 1980s, Nobel Laureate Richard Feynman and mathematician Yuri Manin independently proposed a computer that takes advantage of quantum mechanics to simulate Nature, which is itself quantum mechanical.
By storing and processing information quantum mechanically, a “quantum computer” can take advantage of counter-intuitive quantum features such as superposition (the ability to hold 1 “and” 0 rather than 1 “or” 0) and entanglement (the ability to correlate qubits regardless of distance). For example, a quantum computer with 300 qubits (a qubit being the quantum equivalent of a bit in a classical computer) can represent 2^300 (about 10^90) numbers and manipulate them simultaneously. A classical computer would need more atoms than exist in the observable universe just to replicate the storage for such a feat. This means that, unlike classical computers, quantum computers scale their processing power exponentially with each added qubit.
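The scaling described above can be made concrete with a short, classical simulation. The sketch below (the function name is illustrative, not from any quantum library) represents an n-qubit register the way a classical machine must: as a list of 2^n complex amplitudes. Putting every qubit into superposition spreads equal amplitude across all 2^n basis states, and the list doubles in size with each added qubit, which is exactly why classical memory runs out long before n = 300.

```python
import math

# A classical simulation of an n-qubit register. The state of n qubits is a
# list of 2**n complex amplitudes, so memory doubles with every qubit added.
def uniform_superposition(n):
    """Model applying a Hadamard gate to each of n qubits starting from
    |00...0>, producing an equal superposition over all 2**n basis states."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)           # every basis state gets equal amplitude
    return [complex(amp, 0)] * dim

state = uniform_superposition(10)
print(len(state))                      # 1024 amplitudes for just 10 qubits

total = sum(abs(a) ** 2 for a in state)
print(round(total, 9))                 # squared amplitudes sum to 1.0

# The figure quoted in the text: 2**300 really is about 10**90.
print(round(math.log10(2 ** 300), 1))  # 90.3
```

At 10 qubits the list already holds 1,024 amplitudes; at 300 qubits it would hold roughly 10^90, more entries than there are atoms in the observable universe.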
While progress has historically been slow, given the recent leaps we think the industry is at a critical juncture on the S-curve. Just as in the middle of the last century innovations in transistors and software occurred at the same time to create the computing revolution, we are beginning to witness something similar in quantum computing. The physics of quantum computing has been understood for a while; what is now being solved is the engineering. This is similar to the leap from understanding electricity to commercialising the light bulb by using a tungsten filament. IBM, for example, built a 7-qubit test quantum computer in 2001 and, by 2017, had announced a 50-qubit prototype. The creation of a genuinely useful quantum computer, together with the demonstration of “quantum supremacy” (the point at which a quantum machine outperforms any classical one), will irreversibly disrupt the pharmaceutical, agriculture, energy, security and AI verticals.
If you would like more information on this subject please click here