The history of technological advancement is marked by milestones: Einstein's theory of relativity, Fleming's discovery of penicillin, Watson and Crick's uncovering of the structure of DNA. None of these had much practical significance at the time, and wouldn't for decades, yet each ushered in a new era of technology.
Google’s recent announcement that it had achieved quantum supremacy can be seen in a similar light. While the company proved that its quantum computer could solve a particular problem in minutes that would take a conventional supercomputer thousands of years, that problem itself is of little practical value.
However, the same could be said about the achievements of Einstein, Fleming, and Watson and Crick. They were important not in and of themselves, but because of the possibilities they would unleash later on. The truth is that quantum supremacy is a harbinger of a future we can't see yet. It marks a new era that will take us in completely new directions.
How the death of logic led to a universe of ones and zeroes
The modern concept of computing began, strangely enough, with a hole at the center of logic known as Russell’s paradox, which showed that, under certain conditions, a proposition can be both true and untrue. The dilemma led to a series of problems posed by the mathematician David Hilbert to resolve what was seen as a foundational crisis in mathematics.
The matter at hand centered largely around three principles: completeness, consistency and computability. To the horror of many, in 1931 the 25-year-old Kurt Gödel published his incompleteness theorems, which showed that any such system could be complete or consistent, but not both. Five years later, Alan Turing showed that not every number is computable, either.
The hole at the center of logic would remain. Yet there was a silver lining to the whole affair. In proving his theory, Turing also created the concept of a universal computer, today known as a Turing machine. The idea, in a nutshell, was that a simple machine reading and writing ones and zeroes on a tape could, given enough time, compute any computable number.
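Turing's idea can be sketched in a few lines of code. The simulator below is a toy illustration under assumed conventions, not Turing's original formalism: a table of (state, symbol) rules drives a head that reads and writes a tape, and the example program simply inverts a string of ones and zeroes before halting.

```python
# A minimal Turing machine simulator. The "program" is a transition table
# mapping (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(program, tape, state="start", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    # Read the tape back in order, dropping trailing blanks
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example program: scan right, flipping each bit; halt at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "10110"))  # -> 01001
```

The point is not this particular program but the architecture: one fixed machine, fed different tables of rules, can carry out any computation a special-purpose machine could.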
Turing’s idea became the basis for today’s digital economy. Using incredibly long sequences of ones and zeroes, computer chips allow us to do everything from sending emails and writing documents to searching for information and playing video games. It’s an amazing achievement, but also inherently limited.
Computing in three dimensions
Every technology eventually hits theoretical limits, and digital computing is at that point today. The crux of the problem is that there are only so many transistors we can fit onto a silicon wafer before quantum effects such as tunneling kick in and begin mixing up all those ones and zeroes. So to advance further we need to come up with new ways of computing things.
One of the most viable options is quantum computing, which uses quantum effects like entanglement and superposition to perform calculations. Unfortunately, quantum effects are notoriously counterintuitive. In fact, the great physicist Richard Feynman once remarked that nobody, not even world-class experts like him, really understands them. So that explanation isn't very helpful.
A simpler and reasonably accurate explanation is that while digital technology computes in two dimensions (strings of ones and zeroes), quantum technology computes in three dimensions. The benefits should be obvious: you can fit a lot more into three dimensions than into two, so quantum computers can handle vastly more complexity than digital computers.
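The "more room" intuition can be made concrete with a toy state-vector simulator (a pedagogical sketch, not how real quantum hardware or production simulators work): while n classical bits hold n values, describing n qubits takes 2**n complex amplitudes, and applying a Hadamard gate to each qubit puts the register into a superposition over all of those basis states at once.

```python
import math

def hadamard(state, q):
    """Apply a Hadamard gate to qubit q of a state vector (list of amplitudes)."""
    h = 1 / math.sqrt(2)
    out = state[:]
    for i in range(len(state)):
        if not (i >> q) & 1:        # visit each pair of basis states once
            j = i | (1 << q)        # partner index with qubit q set to 1
            out[i] = h * (state[i] + state[j])
            out[j] = h * (state[i] - state[j])
    return out

n = 3
state = [0.0] * (2 ** n)            # 2**n amplitudes describe n qubits
state[0] = 1.0                      # start in the basis state |000>
for q in range(n):
    state = hadamard(state, q)      # superpose each qubit in turn

# The register is now an equal superposition of all 2**n basis states.
probs = [round(a * a, 3) for a in state]
print(len(state), probs)  # -> 8 amplitudes, each with probability 0.125
```

Adding a single qubit doubles the number of amplitudes, which is why simulating even a few dozen qubits overwhelms classical machines, and why a quantum computer's native state space is so much roomier than a string of bits.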
Another benefit is that we live in three dimensions, so quantum computers can simulate the systems we deal with every day, like those in materials and biological organisms. Digital computers can do this to some extent, but some information always gets lost translating the data from a three-dimensional world to a two-dimensional one, which leads to problems.
However, the opposite is also true. Quantum computers need to translate classical problems into quantum space in order to compute them. That's why the idea of quantum supremacy has gotten so much attention. For the first time, Google showed that its quantum computer could overcome that handicap and solve a classical problem faster than a digital computer.
The evolving quantum ecosystem
We use milestones to mark the beginning of a new era. Einstein's paper on relativity changed physics forever. Alexander Fleming's discovery of penicillin created the field of antibiotics. Watson and Crick's discovery of the structure of DNA launched modern genetics. In this sense, the attainment of quantum supremacy is a milestone event.
However, take a closer look and you'll find that it takes decades to go from a discovery in a lab to a practical impact in the real world. The problem is that we need more than technology to create value: we need ecosystems of suppliers, customers, service providers and complementary products to create a real market.
For example, while the integrated circuit was invented in 1958 and the first PC was created at Xerox in 1973, it wasn't until the late 1990s that we saw a real impact on productivity. It took that long to build up a strong ecosystem of software suppliers, systems integrators and networking technology (including the Internet) to deliver on the promise of digital technology.
As the journal Nature reports, we can see a similar ecosystem forming in quantum technology today. In addition to companies that are building quantum computing hardware, there is increasing investment in quantum software, communication, tools and sensors that will be necessary to make quantum technology useful.
Reimagining computing for a new era of innovation
It’s tempting to look at the achievement of quantum supremacy and see an accelerated version of the digital age. However, that’s almost certainly not going to be the case. Quantum computing isn’t just a more powerful version of today’s technology, it is a fundamentally different way of computing that will be applied to very different tasks.
To understand how different the future will be, think back to the early days of digital computing, when the machines filled entire rooms. Nobody looked at those machines and saw a device that would fit in your pocket and be used to watch videos and send emoticons. Things that truly change the world always arrive out of context, for the simple reason that the world hasn't changed yet.
What is clear is that the digital era is ending, but much like Turing and the death of logic, this end also marks a new beginning. Quantum technology, along with others such as neuromorphic computing, will change our conception of what problems a computer can solve. Much like with digital computing, that will open up possibilities that are unimaginable today.
That's why the challenge, and the opportunity, today lies in the later stages of the ecosystem: the consumers of the technology. That's where real-world problems get solved. The value of a technology does not lie in its inherent capabilities, but in how people learn to collaborate and apply those capabilities to deliver a meaningful impact on the world.