A disruptive new technology is on the horizon, and it promises to take computing power to unprecedented heights.
And to predict the speed of progress of this new "quantum computing" technology, the director of Google's Quantum AI Labs, Hartmut Neven, has proposed a new rule similar to Moore's Law, which has measured the progress of computers for more than 50 years.
But can we trust "Neven's Law" as a true representation of what is happening in quantum computing and, most importantly, what is to come in the future? Or is it simply too early on in the race to come up with this type of judgement?
Unlike conventional computers that store data as electrical signals that can have one of two states (1 or 0), quantum computers can use many physical systems to store data, such as electrons and photons.
These can be engineered to encode information in multiple states, which enables them to perform certain calculations exponentially faster than traditional computers.
Quantum computing is still in its infancy, and no one has yet built a quantum computer that can outperform conventional supercomputers. But, despite some scepticism, there is widespread excitement about how fast progress is now being made.
As such, it would be helpful to have an idea of what we can expect from quantum computers in years to come.
Moore's Law describes the way that the processing power of traditional digital computers has tended to double roughly every two years, creating what we call exponential growth.
Named after Intel co-founder Gordon Moore, the law more precisely describes the rate of increase in the number of transistors that can be integrated into a silicon microchip.
But quantum computers are designed in a very different way, around the laws of quantum physics, and so Moore's Law does not apply. This is where Neven's Law comes in. It states that quantum computing power is experiencing "doubly exponential growth relative to conventional computing".
Exponential growth means something grows by powers of two: 2^1 (2), 2^2 (4), 2^3 (8), 2^4 (16) and so on. Doubly exponential growth means something grows by powers of powers of two, so the exponent itself doubles at every step: 2^2 (4), 2^4 (16), 2^8 (256), 2^16 (65,536) and so on.
To put this into perspective, if traditional computers had seen doubly exponential growth under Moore's Law (instead of singly exponential), we would have had today's laptops and smartphones by 1975.
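To see the difference concretely, here is a minimal Python sketch (our illustration, not from Neven's announcement) that prints both sequences side by side for the first few steps.

def singly_exponential(step):
    # Grows by powers of two: 2, 4, 8, 16, ...
    return 2 ** step

def doubly_exponential(step):
    # Grows by powers of powers of two: 4, 16, 256, 65,536, ...
    return 2 ** (2 ** step)

for step in range(1, 6):
    print(f"step {step}: singly {singly_exponential(step):>3,} | doubly {doubly_exponential(step):,}")

After only five steps, the singly exponential sequence has reached 32, while the doubly exponential one has already passed four billion.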
This enormously fast pace should soon lead, Neven hopes, to the so-called quantum advantage. This is a much-anticipated milestone where a relatively small quantum processor overtakes the most powerful conventional supercomputers.
The reason for this doubly exponential growth is based on an in-house observation. According to an interview with Neven, Google scientists are getting better at decreasing the error rate of their quantum computer prototypes. This allows them to build more complex and more powerful systems with every iteration.
Neven maintains that this progress is itself exponential, much like Moore's Law. On top of that, a quantum processor is inherently exponentially more powerful than a classical one of equal size.
This is because it exploits a quantum effect called entanglement, which allows different computational tasks to be carried out at the same time, producing exponential speed-ups.
So, simplistically, if quantum processors are developing at an exponential rate and they are exponentially faster than classical processors, quantum systems are developing at a doubly exponential rate in relation to their classical counterparts.
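As a rough illustration of that composition (using made-up growth figures, not Google's data), the sketch below assumes the qubit count grows exponentially with each hardware iteration, and that matching n qubits classically costs on the order of 2^n operations; composing the two gives doubly exponential growth in relative power.

def qubits_after(iterations, growth_rate=2):
    # Assumption for illustration: qubit count grows exponentially per iteration.
    return growth_rate ** iterations

def classical_equivalent_cost(num_qubits):
    # Rule of thumb: simulating n qubits classically costs on the order of 2**n.
    return 2 ** num_qubits

for it in range(1, 6):
    n = qubits_after(it)
    print(f"iteration {it}: ~{n} qubits -> classical cost ~2^{n} = {classical_equivalent_cost(n):,}")

The iteration count enters one exponent, and that result enters a second exponent, which is exactly the 2^(2^n) pattern described above.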
A note of caution
While this sounds exciting, we need to exercise some caution. For starters, Neven's conclusion seems to be based on a handful of prototypes and progress measured over a relatively short timeframe (a year or less).
So few data points could easily be made to fit many other patterns of extrapolated growth.
There is also a practical issue: as quantum processors become increasingly complex and powerful, technical problems that seem minor now could become much more significant.
For example, the presence of even modest electrical noise in a quantum system could lead to computational errors that become more and more frequent as the processor complexity grows.
This issue could be solved by implementing error correction protocols, but this would effectively mean adding lots of backup hardware to the processor that is otherwise redundant.
So the computer would have to become much more complex without gaining much extra power, if any. This kind of problem could affect Neven's prediction, but at the moment it's just too soon to call.
Despite being just an empirical observation and not a fundamental law of nature, Moore's Law foresaw the progress of conventional computing with remarkable accuracy for about 50 years.
In some sense, it was more than just a prediction, as it stimulated the microchip industry to adopt a consistent roadmap, develop regular milestones, assess investment volumes and evaluate prospective revenues.
If Neven's observation proves to be as prophetic and self-fulfilling as Moore's Law, it will certainly have ramifications well beyond the mere prediction of quantum computing performance.
For one thing, at this stage, nobody knows whether quantum computers will become widely commercialised or remain the toys of specialised users. But if Neven's Law holds true, it won't be long until we find out.
Alessandro Rossi, Chancellor's Fellow, Department of Physics, University of Strathclyde and M. Fernando Gonzalez-Zalba, Research Fellow, University of Cambridge.
This article is republished from The Conversation under a Creative Commons license. Read the original article.