Last month I had the opportunity to moderate two different panels at the Chicago Quantum Forum. I’m far from an expert in quantum but have been spending more time in the space, given all the connections between semiconductors, AI, and quantum.
The first conversation was a fireside chat with Pat Gelsinger, formerly CEO of Intel and now a partner at Playground Global, the venture capital firm. Pat argued it was best to see quantum in the context of a “trinity of compute”—classical, AI-accelerated, and, next, quantum—each coordinated with and compounding advances in the others. I find that many people outside the industry conceptualize quantum as a replacement for classical computing, but Pat’s right that it’s really a complement. That’s why big tech firms like Microsoft, Google, IBM, and Amazon are all investing in it—they want to offer quantum alongside more traditional computing services.
The second conversation was with Om Nalamasu, CTO of Applied Materials, and Steve Brierley, CEO of Riverlane, the quantum error correction firm. Neither company is building quantum computers, but both are enabling them, via either better materials or better error correction capabilities.
This spoke to an important dynamic: a quantum supply chain is emerging, with companies focusing on different segments of the quantum computing stack. Steve and I spent the last couple of weeks discussing ways in which the quantum supply chain will and won’t look similar to the semiconductor supply chain. We put pen to paper below:
If Isaac Newton’s discoveries required standing on the shoulders of giants, today’s progress in quantum computing stands on the shoulders not only of giant technological advances but of giant supply chains, too. We’re still in the very early days of the quantum industry, but we can already draw some tentative conclusions about the shape of the quantum supply chain. It’s not only deeply intertwined with the classical computing and semiconductor supply chains, drawing on similar materials and manufacturing processes as well as high-performance computing capabilities; the decades-long development of the semiconductor supply chain also provides an analogy for how abstraction layers and specialized supply chain segments are likely to emerge in quantum.
Since the invention of the transistor in 1947 and the first integrated circuits in 1958 and 1959, the semiconductor industry has benefited from over 75 years of specialization and abstraction. In the early days, chipmakers had to do everything themselves. Bob Noyce, one of the inventors of the integrated circuit and a co-founder of Fairchild Semiconductor, spent the late 1950s driving across the Bay Area, visiting camera shops in search of lenses for homemade photolithography machines. Around the same time, when future TSMC founder Morris Chang was still an assembly line supervisor at Texas Instruments, the company produced its own silicon wafers and other materials.
It was only a decade or two later that firms selling specialized chipmaking equipment emerged. Applied Materials–which started out selling chemicals before pivoting to chipmaking tools–was founded in 1967. Other market leaders came later. KLA was founded in 1975. Lam Research was established in 1980. And Philips didn’t spin off its in-house lithography tool division–now ASML–until 1984.
This specialization, which also occurred in the materials space, has been critical to technological advances. It’s hard enough today for TSMC to make the world’s most advanced chips. If it also had to produce the world’s most specialized photoresists, the world’s most purified silicon wafers, and the world’s most advanced lithography, deposition, etch, and metrology equipment, it’s pretty clear the pace of progress would be far slower.
Each leap in supply chain specialization created a new abstraction layer. Chip manufacturers no longer needed to know how to make lithography tools; they only needed to know how to use them. This enabled deeper specialization and a faster rate of progress.
We’re already seeing this in the quantum space. It’s enabled in part by the specialization that already exists thanks to the semiconductor supply chain. Quantum companies don’t need to reinvent materials or equipment that can be bought off the shelf from existing suppliers, often in the semiconductor industry. And specialized suppliers are already emerging in, for example, the systems needed to produce the ultra-cold temperatures that certain qubit technologies require.
There were two other big supply chain shifts in the chip industry that corresponded with the emergence of new abstraction layers. The first was the emergence of independent chip design software vendors like Cadence and Synopsys. When Intel was founded in 1968, chips were still designed by hand, with the designs physically cut with a knife into a plastic-like material called rubylith, which was then used to produce masks for photolithography. It was possible to produce chips with a thousand hand-cut transistors, but it would be inconceivable to design today’s chips, with over a billion transistors, by hand.
That’s why the emergence of a new abstraction layer–with chip design software turning chip design into a process that today looks more like coding than traditional circuit design–was so important. It led to another key abstraction: the splitting of chip design from chip manufacturing and the rise of the fabless-foundry model. From the late 1980s–already three decades after the first integrated circuits–companies like TSMC emerged to specialize solely in manufacturing, while others like Qualcomm and Nvidia were founded with the exclusive aim of designing chips. This abstraction enabled a step-change in specialization, driving further technological progress.
We’re not at that moment yet with quantum, but you can already begin to see its outlines. Steve’s company Riverlane, for example, focuses exclusively on quantum error correction. Other companies like Phasecraft and QC Ware devise more efficient quantum algorithms. Quantum Foundry is developing qubit fabrication capabilities, and Emergence Quantum is developing scalable qubit control systems. The big tech and cloud computing firms have a vision not of selling a quantum computer for every office but of providing quantum capabilities accessible to users who may never actually see a quantum computer in person–not all that different from today’s cloud computing paradigm. All of these companies share a common theme: each tackles a specific part of the supply chain that is common enough across different qubit modalities to support a specialized supplier.
Looking at the quantum supply chain today, several things stand out. One is the overlap with the existing classical computing stack, from the materials up through the cloud computing model. A second is how much more differentiated and specialized the quantum supply chain is at this early stage of development, in contrast to the classical computing and semiconductor supply chains, which took longer to break into separate segments. The fact that we’re talking about a quantum supply chain before we have a commercially useful quantum computer is pretty extraordinary.
One key reason is that, whereas it took time for the classical computing industry to use computers to accelerate its own progress (e.g., in design or metrology), quantum is already benefiting from today’s supercomputing capabilities. It was scarcely possible to use 1960s computers to design new chips or computers because their capabilities were so limited. The emergence of specialized chip design software only became possible after the industry had matured for several decades.
Quantum doesn’t have to wait, because existing high-performance computing and AI capabilities are already being deployed to accelerate quantum. The next question is how quickly quantum capabilities will emerge that further accelerate the flywheel. It’s easy to imagine that, in a couple of years, quantum computing will accelerate materials science in a way that catalyzes even better quantum capabilities.
What lessons does this imply for companies, investors, and policymakers focused on the quantum space? One is the challenge of creating clear interfaces that unlock the formation of a supply chain. Interfaces between companies emerge out of commercial or technical necessity, to reduce risk, cost, or time to market. Programs such as Darpa’s Quantum Benchmarking Initiative (QBI) have accelerated the definition of interfaces in quantum by requiring performers to set out in detail how they will reach utility scale. Notably, Darpa did much the same thing in helping develop the chip industry.
A second observation is that it seems highly unlikely that a single country will own the entire quantum computing supply chain. The problem is simply too complex, and with limited talent pools, the race to utility-scale quantum computing will require collaboration, at least among allies. Yet today most governments have essentially the same quantum strategy: to develop and exploit national capabilities in quantum computing. We’ve yet to see intentional plays to win valuable parts of the supply chain based on national strengths, as Taiwan so successfully did with TSMC, the Netherlands with ASML, or the UK with ARM.
Third is the long time horizon over which deep scientific research translates into practical application. The invention of the transistor was famously reported on page 46 of the New York Times. Lithography techniques were honed by engineers at places like Bell Labs for decades before they produced hundred-billion-dollar companies. Venture capital firms have focused too much on the sugar rush of SaaS startups and too little on the longer-term–but far more technologically and economically transformative–computing technologies that require patience.
Governments and academic institutions must be patient, too, in their investments in deep science and technological development to build the foundation–and the supply chains–on which giant leaps forward depend.
Read more here.