By Ivan Ostojic and Rodney Zemmel
This post is based on a roundtable with John Martinis, professor of physics at UC Santa Barbara; Pete Shadbolt, cofounder of PsiQuantum; and Matt Trevithick, COO of Google Quantum AI.
For any emerging technology, the concentration of activity and investment around it says a lot about its maturity and what lies ahead. The vast majority of investment in quantum computing to date—more than 80 percent—is going into hardware or its components, which are needed to unlock the opportunity. The next phase will focus more on software and applications. This area is attracting interest from start-ups but has not yet drawn the breadth of support that hardware enjoys.
Sizing the prize
Despite the uncertain timetable and path forward for quantum computing, companies are salivating over the potentially massive market opportunity. For example, charge-transport modeling in the battery-technology business could be worth at least $5 billion yearly. Big pharma could apply quantum computing to protein structure and interactions, an opportunity worth $200 billion. Telecoms could use it to optimize their capital spending, worth a potential $50 billion to $70 billion. These applications are just a small sample of the value at stake.
For the next decade, quantum computing will likely act as a coprocessor for existing classical implementations of artificial intelligence (AI), which rely on processing very large amounts of data. It could also excel in other applications, such as pattern detection and optimization, which may be very useful to some AI algorithms.
The road to viability
The technical path for bringing quantum computing to market remains unclear, but several big tech firms and start-ups have set a goal of creating a functional industrial quantum computer within ten years. Some players have set an intermediate milestone of creating an error-corrected quantum computer that uses superconducting qubits. In a major development, in July 2020 Google established a road map for getting to an error-corrected quantum computer by the end of the decade. About a month after that, a Chinese group proposed a very similar road map, and a month after that, IBM published an almost identical road map.
Multiple approaches are being employed to overcome the two key obstacles for producing an industrial quantum computer: scalability and reliability (noise reduction) of the underlying qubits.
Scalability refers to the challenge of producing interconnected qubits in large quantities. The expectation is that industrializing and standardizing qubit production will overcome this challenge. Companies can also build on existing knowledge of semiconductor electronics—for example, by using silicon-based or superconducting qubits.
Reliability, or noise reduction, has been a challenge from the outset of quantum computing because quantum states must remain coherent and entangled long enough for the computer to operate reliably. Quantum error correction and quantum repeaters have mitigated the problem; companies have also investigated topological protection of quantum states (that is, making states intrinsically resistant to noise by design).
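To give a rough sense of why error correction pays off, here is a minimal, illustrative sketch (not from the roundtable) using a classical three-bit repetition code as an analogy: encoding one bit redundantly and decoding by majority vote turns a physical error rate p into a much smaller logical error rate of roughly 3p². Real quantum error-correcting codes, such as surface codes, are far more involved—they must also handle phase errors and cannot simply copy quantum states—but the underlying intuition that redundancy suppresses errors is similar.

```python
# Toy illustration (classical analogy, not real quantum error correction):
# majority voting over three noisy copies turns a physical error rate p
# into a logical error rate of about 3p^2 - 2p^3.
import random

def logical_error_rate(p: float, trials: int = 100_000) -> float:
    """Estimate how often majority voting over 3 noisy copies decodes wrongly."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(3))
        if flips >= 2:  # two or more copies corrupted: majority vote fails
            failures += 1
    return failures / trials

for p in (0.1, 0.01, 0.001):
    print(f"physical error {p}: logical error ~ {logical_error_rate(p):.5f} "
          f"(theory ~ {3 * p**2 - 2 * p**3:.5f})")
```

At a physical error rate of 0.1, the encoded bit still fails about 3 percent of the time; at 0.001, failures drop to roughly 3 in a million—an illustration of why driving down qubit noise and layering on error correction are both essential.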
A sizable commitment of resources
The challenge of bringing quantum computing to market is massive—comparable to the human genome project or landing on the moon. A few guideposts reinforce the daunting scale of the challenge. One estimate is that it takes about 200 scientists and engineers to develop a new silicon-based microprocessor, if the team has experience with the process and draws on 70 years of innovation in the semiconductor industry. By contrast, SpaceX now employs about 10,000 people pursuing the goal of colonizing Mars.
The time and resources it will take to develop a quantum computer fall somewhere between those two examples, because the effort involves not only producing the quantum processor but also building a supply chain around it. Convening a team of that scale and caliber for an extended period—say, a decade—to build a quantum computer requires deep pockets. That means only a limited number of organizations globally can mount an effort with sufficient scale and duration to build a quantum computer.
The capital required to get there also represents a major hurdle. But with enough money, scientists and engineers will eventually be able to overcome the challenges of building a quantum computer—and unlock significant market opportunities.
A demand beacon to focus the effort
To support and possibly accelerate this process, researchers will need an appropriate signal from the market so that resources and talent can be mobilized and directed on an ongoing basis. If the business community were to come together and settle on a few primary use cases—for example, a new battery chemistry or a new approach to carbon sequestration—or a specific problem to solve, it would give researchers a defined target.
One of the greatest feats of innovation, the Apollo program (the original moon shot), had a clear, tangible goal: getting to the moon. This target focused the attention and talents of smart people on figuring out exactly how powerful a spacecraft would need to be to get there.
A technological feat such as quantum computing can feel impossible and just beyond the horizon—until it is suddenly viable. The progression to an industrial quantum computer will be marked by a multitude of incremental successes and the persistent focus of resources and talent.
Ivan Ostojic is a partner in McKinsey’s Zurich office, and Rodney Zemmel is a senior partner in the New York office.