In an industry where timelines are often fuzzy and subject to revision, the quantum-computing company PsiQuantum has set itself an aggressive target. The California-based startup has committed to building a fault-tolerant quantum computer with roughly a million qubits by 2027. And the company is now assembling its first full-fledged prototype in a California warehouse. Crossing the million-qubit threshold could start to realize quantum computing’s promise to revolutionize areas like materials science and chemistry.

Founded by a quartet of academics from British universities in 2016, the company has raised US $1.7 billion to build an optical quantum computer based on silicon photonics. Using photons to store and manipulate quantum information raises a very different set of challenges compared with those of “matter-based” approaches like superconducting qubits, trapped ions, or neutral atoms, says chief scientific officer and cofounder Pete Shadbolt. But the company is betting that by building on top of mature networking and photonics technology, it can reach scale ahead of its competitors.

Realizing that vision has required PsiQuantum to make breakthroughs in materials science, develop bespoke cryogenics technology, and industrialize the production of its photonic chips. But with the key components now in place, the company has started putting them together to build its “Alpha System” at a new facility in Milpitas, Calif., that will officially open later this year.

“This system in California will be orders of magnitude more complex than any system we’ve tested previously,” says Shadbolt. “This is the first time that we’re building a networked system of large numbers of photon sources with real silicon and real [cryogenic] cabinets.”

Planning for scale from the ground up

One of the key factors that differentiates PsiQuantum from its competitors, says Shadbolt, is that the company has focused from the start on building a full-scale, fault-tolerant quantum computer. There was initially hope in the industry that smaller “noisy intermediate-scale quantum” (NISQ) computers could do useful work without error correction, but today there’s a growing consensus that true usefulness will become possible only with full fault tolerance.

Shadbolt says PsiQuantum operated on this assumption from the start, and that it drove the decision to focus on optical approaches. Reaching the millions of qubits necessary to implement error correction at scale requires solving four key challenges—cooling, control, connectivity, and manufacturability, he says—all of which are easier with photonics than with competing hardware.

Matter-based qubits are highly sensitive to temperature fluctuations and electromagnetic radiation, which means they need to be chilled to near-absolute zero using either dilution refrigerators or laser-based cooling systems. In contrast, photons are resistant to both heat and radiation, which means that in principle they can operate as qubits at room temperature.

In practice, PsiQuantum’s hardware is still kept at cryogenic temperatures. The design relies on superconducting photon detectors that operate between 2 and 4 kelvins, but achieving those temperatures is far easier than reaching the millikelvin regime that dilution refrigerators provide, says Shadbolt. While the dilution refrigerators required by superconducting qubits can house at most one or two chips, PsiQuantum has designed cryogenic cabinets the size of a server rack that can hold roughly 250. In the company’s new facility, three of these cabinets will be cooled by a cryoplant made by the engineering giant Linde.

The cryoplant for PsiQuantum’s upcoming prototype quantum computer. Colby Macri/PsiQuantum

Photons’ resistance to heat and radiation also makes it possible to pack control electronics close to the qubits, says Shadbolt, something that is proving much harder for matter-based approaches. And crucially, photons can be transmitted over standard telecom fiber, which makes networking chips together much simpler. The company recently demonstrated the ability to transmit qubits over 250 meters of fiber with 99.7 percent fidelity, says Shadbolt.
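
As a rough illustration of what that figure implies, one can extrapolate it under the simple assumption (mine, not the company’s) that fidelity decays exponentially with fiber length:

```python
# Back-of-the-envelope extrapolation. The 250 m / 99.7 percent figure
# comes from the article; the exponential-decay model is an assumption
# made here purely for illustration.
F_MEASURED = 0.997   # reported transmission fidelity
D_MEASURED = 250.0   # meters of telecom fiber

def estimated_fidelity(distance_m: float) -> float:
    """Fidelity at a given distance under the exponential model."""
    return F_MEASURED ** (distance_m / D_MEASURED)

for d in (250, 1_000, 10_000):
    print(f"{d:6,d} m -> ~{estimated_fidelity(d):.4f}")
```

Even under this toy model, kilometer-scale links would stay above 98 percent fidelity, which hints at why standard fiber is an attractive way to tie many cabinets together.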

Getting ready for mass manufacturing

However, says Shadbolt, one of the biggest challenges in building a large-scale quantum computer is manufacturing. Most quantum computers are bespoke devices. But by building on top of mature silicon-photonics technology, PsiQuantum, working with GlobalFoundries, has been able to create a commercial fabrication process for its chips, which the companies detailed in a Nature paper earlier this year.

“The key insight that we were founded on is that you can’t change the semiconductor industry very much,” says Shadbolt. “If you want to make millions of devices at a high level of maturity, you need to leverage the trillion dollars in 50 years that’s gone into the semiconductor industry.”

Getting the company’s chips production-ready wasn’t simple, as they incorporate novel designs and materials, including superconducting photon detectors and ultrafast optical switches. But GlobalFoundries is now churning out thousands of PsiQuantum’s chips at a commercial semiconductor fab in Malta, N.Y.

While all the components have been independently tested, the system under construction in Milpitas will be the first true test of the company’s overall architecture. Shadbolt says he hopes to have the system cold by the end of the year, ready to start experiments by early 2026.

Crucially, these experiments will not involve running quantum algorithms, says Mercedes Gimeno-Segovia, VP for system architecture. Companies like Google and IBM have used smaller prototypes to demonstrate quantum supremacy on toy problems, but Gimeno-Segovia says NISQ machines behave so differently from fault-tolerant ones that these kinds of experiments provide little insight. Instead, PsiQuantum’s Alpha System is designed to test whether the behavior of the system matches the predictions made by the company’s models, which will be crucial for designing future systems.

“We’re not trying to impress anybody, frankly,” she adds. “What we’re trying to do is say, Do we understand the system that we’re building? And the telltale fact that tells us whether we do or not, is whether we can predict this behavior.”

The trouble with flighty photons

Simon Devitt, research director at the Centre for Quantum Software and Information at the University of Technology Sydney, thinks PsiQuantum’s decision to skip the NISQ regime and jump straight to full fault tolerance is a good approach. But he points out that the company had little choice.

PsiQuantum’s system relies on linear optics, where photon generation is inherently nondeterministic, says Devitt, which means gate operations fail roughly 25 to 50 percent of the time. PsiQuantum has come up with clever ways to reduce this number by running many photon-generation attempts and then picking out successful ones, something known as multiplexing. But this only partly solves the problem, and remaining gate failures must be dealt with by error correction.
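
To see how multiplexing attacks the problem, consider a toy model (an illustration, not PsiQuantum’s actual scheme): if a single nondeterministic attempt succeeds with probability p, then running N attempts in parallel and switching out one success raises the overall success probability to 1 - (1 - p)^N.

```python
# Toy model of multiplexing: run N nondeterministic attempts in
# parallel and pick out one success. The single-attempt probabilities
# below bracket the 25 to 50 percent gate-failure range quoted by
# Devitt; the attempt counts are illustrative assumptions.

def multiplexed_success(p_single: float, n_attempts: int) -> float:
    """Probability that at least one of n_attempts succeeds."""
    return 1.0 - (1.0 - p_single) ** n_attempts

for p in (0.5, 0.75):
    for n in (1, 4, 16):
        print(f"p={p:.2f}  N={n:2d}  ->  success {multiplexed_success(p, n):.4%}")
```

The residual failure rate falls geometrically with N, but the switching network needed to route out the successful photon adds optical loss of its own, which is one reason multiplexing only partly solves the problem.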

The inside of PsiQuantum’s new facility in Milpitas, Calif. Colby Macri/PsiQuantum

This means it’s essentially impossible to run quantum algorithms on the hardware until fault tolerance is achieved, says Devitt. More importantly, he adds, it means a huge amount of the error-correction budget is used up fixing these gate failures. This leaves little leeway for other sources of error, such as detector inefficiencies or optical losses from coupling chips to fiber.

“Photons are extremely easy to lose,” says Devitt. “So that’s really where a lot of the questions arise, as to whether or not they can get their devices working to the required accuracies so that they don’t overwhelm the error correction.”

Optical loss is the biggest source of errors in PsiQuantum’s system after gate errors, says Devitt. Efforts to reduce it hinge on three key components—waveguides, photon detectors, and optical switches. Judging from the data the company has published in its recent Nature paper or shared on X, the first two appear ready, he says, but losses in the company’s switches are still too high. “The question is, is it a material-science limitation they’re hitting?” he adds. “Or is it just about purity and fabrication?”
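
One quick way to see why the switches loom so large: a photon has to survive every component it traverses, so end-to-end survival is the product of the individual transmissions, and the leakiest component dominates the budget. The sketch below uses purely hypothetical transmission values, not measurements from the Nature paper, to make the point:

```python
import math

# Hypothetical per-component transmissions, chosen only to illustrate
# how a single lossy component dominates the loss budget. These are
# NOT PsiQuantum's published numbers.
transmissions = {
    "waveguide": 0.995,
    "detector":  0.990,
    "switch":    0.950,   # assumed laggard, reflecting Devitt's concern
}

survival = math.prod(transmissions.values())
print(f"end-to-end survival: {survival:.4f} "
      f"({-10 * math.log10(survival):.3f} dB total loss)")
for name, t in transmissions.items():
    print(f"  {name:>9}: {-10 * math.log10(t):.3f} dB")
```

With these made-up numbers, the switch alone contributes more loss than the other two components combined, which is why Devitt singles it out.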

Shadbolt is confident it’s the latter. He sees no major hurdles; instead, closing the gap will take thousands of small, incremental improvements to the design and geometry of the chips, alongside tweaks to the fabrication process. Optical loss depends heavily on the precision with which components can be built, Shadbolt adds, so being able to lean on GlobalFoundries’ world-leading tools and processes gives the company a major advantage.

“It’s very challenging to make these components with good enough performance, but we have a great track record of improving performance,” he says. “We know the steps that we’re going to take to close the remaining gap, and we have high confidence in our ability to do that.”

Paul Smith-Goodson, principal analyst at Moor Insights & Strategy, thinks PsiQuantum has a realistic chance of meeting its lofty goals. While it still has a way to go in cutting losses and faces a huge job integrating all these components at scale, he thinks the company is on track.

For him, the bigger challenge may be financial rather than technical. Despite the massive sums the company has raised, that money will cover only a few prototypes, and PsiQuantum will need to raise significantly more to build a full-scale machine. “It takes a lot of money to do what they’re doing,” he says.
