Quantum computing may still seem like the stuff of science fiction, but it is quietly moving from theoretical promise to practical exploration, particularly within high-performance computing (HPC) environments.
In this feature, Owen Thomas, CEO and senior partner at Red Oak Consulting, explores the emerging convergence between HPC and Quantum Processing Units (QPUs), outlining both the opportunities and operational challenges of building hybrid systems. From orchestration and standards to skill gaps and strategic planning, he offers a grounded, long-term perspective on how these once-separate fields are beginning to align, and why now is the time for enterprise and public sector leaders to start preparing.
– – – – – –
In computational science, speed has always mattered, but so have precision, scalability, and adaptability. High-Performance Computing (HPC) has long underpinned our most complex simulations and models, from weather forecasting to aerospace design. Now, another form of computing is emerging at the edge of this domain: quantum. Though often confined to academic speculation or tech press hyperbole, quantum computing is beginning to find a foothold through a measured, deliberate convergence with HPC infrastructure.
The integration of Quantum Processing Units (QPUs) into HPC environments remains in its infancy. But the early signs are promising. At this year’s ISC High Performance conference[1], a notable shift could be felt. Conversations around quantum were no longer siloed; they were integrated into broader discussions about hybrid computing, data workflows, and the future of scientific infrastructure.
Quantum’s Niche but Vital Role
Quantum computing is not a replacement for HPC. Rather, it is an augmentation – highly specialised, still experimental, but with the potential to solve certain classes of problems exponentially faster than classical systems. These include quantum chemistry simulations, complex optimisation challenges, and problems rooted in probabilistic reasoning, such as Monte Carlo simulations or financial risk modelling.
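To make the Monte Carlo case concrete, the toy sketch below estimates a simple expected payoff by classical sampling. It is illustrative only – all figures are arbitrary – but it is exactly the kind of sampling workload for which quantum amplitude estimation promises, in theory, a quadratic reduction in the number of samples required.

```python
# Toy classical Monte Carlo estimate of an expected payoff – the kind of
# sampling workload where quantum amplitude estimation promises, in theory,
# a quadratic speedup. Pure Python; all figures are arbitrary illustrations.
import random

def payoff(x: float) -> float:
    # Simplified option-style payoff: pays (x - 1.0) when positive, else 0.
    return max(x - 1.0, 0.0)

random.seed(42)                # reproducible sampling
n_samples = 100_000
# Draw underlying values from a normal distribution (illustrative only).
total = sum(payoff(random.gauss(1.0, 0.2)) for _ in range(n_samples))
print(f"Estimated expected payoff: {total / n_samples:.5f}")
```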
Domains like these have always posed challenges for traditional HPC, not for lack of brute force, but because of the sheer complexity of the problem space. Quantum's value lies in its different model of computation – superposition, entanglement, and interference – which allows it to explore many states simultaneously and converge on solutions that might elude even the most powerful classical supercomputers.
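As a concrete illustration of that model, the sketch below builds a two-qubit Bell state: a Hadamard gate puts one qubit into superposition, and a CNOT entangles the second qubit with it. It assumes Qiskit, a widely used open-source SDK; the choice of framework is ours, not something the article prescribes.

```python
# Minimal illustration of superposition and entanglement using Qiskit
# (an assumed tool choice – any comparable quantum SDK would do).
from qiskit import QuantumCircuit

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard: qubit 0 enters an equal superposition of |0> and |1>
qc.cx(0, 1)  # CNOT: entangles qubit 1 with qubit 0, giving (|00> + |11>)/sqrt(2)
print(qc.draw())  # text diagram of the circuit
```

Measuring either qubit of this state collapses both, producing perfectly correlated outcomes – the kind of behaviour classical bits cannot reproduce.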
From Simulation to Integration
Much of the current integration takes place through simulation. QPUs remain rare and fragile: highly sensitive to noise, expensive to operate, and still constrained in the number of qubits available. Consequently, many hybrid workflows are designed to simulate quantum processes on HPC systems, using quantum-inspired algorithms and emulators.
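In practice, such emulation looks like the sketch below: the same Bell circuit from earlier, executed entirely on classical hardware through a simulator backend. It assumes the qiskit and qiskit-aer packages; again, the tooling is an assumption on our part, not named by any particular deployment.

```python
# Emulating a small quantum circuit on classical hardware, as in today's
# hybrid HPC workflows. Assumes the qiskit and qiskit-aer packages.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()  # sample the entangled state

sim = AerSimulator()  # classical emulator; no quantum hardware involved
counts = sim.run(qc, shots=4096).result().get_counts()
print(counts)  # expect roughly equal '00' and '11' counts, and no '01'/'10'
```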
But a shift is underway. Work is now being done to orchestrate real-time workflows that span both HPC and quantum architectures, where an HPC scheduler might offload certain parts of a workload to a quantum accelerator, much like it would to a GPU. This model requires new orchestration layers, middleware, and compilers, none of which exist at scale yet. But the effort to build them has begun.
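Because no such orchestration layer yet exists as a standard, the sketch below is purely hypothetical: every class and function in it is invented for illustration, showing only the shape a GPU-style offload pattern might take when the accelerator is a QPU.

```python
# HYPOTHETICAL offload sketch – none of these names correspond to a real
# middleware API. It only illustrates how an HPC scheduler might route
# selected parts of a workload to a quantum accelerator, GPU-style.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    name: str
    kind: str  # "classical" or "quantum"

def run_on_cpu(task: Task) -> str:
    # Stand-in for a conventional HPC kernel (e.g. an MPI job).
    return f"CPU finished {task.name}"

def run_on_qpu(task: Task) -> str:
    # Stand-in for submitting a circuit to a quantum backend via
    # vendor middleware and a hardware queue.
    return f"QPU finished {task.name}"

def dispatch(task: Task) -> str:
    """Route each task to the backend suited to it, as a scheduler might."""
    backend: Callable[[Task], str] = run_on_qpu if task.kind == "quantum" else run_on_cpu
    return backend(task)

for job in [Task("fluid-dynamics step", "classical"), Task("VQE iteration", "quantum")]:
    print(dispatch(job))
```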
Bridging Two Worlds
Quantum computing has largely been driven by physicists, mathematicians, and algorithm designers. HPC, on the other hand, is shaped by engineers, system architects, and large-scale infrastructure specialists. These disciplines speak different languages. But increasingly, they need each other.
At ISC, quantum professionals repeatedly emphasised the value of HPC expertise, particularly in workload orchestration, energy efficiency, and scaling infrastructure. The reality is that quantum systems will not function in isolation; they will need to be integrated into data centres, cloud platforms, and edge networks. And HPC veterans have spent decades solving those very problems.
The Skills and Software Gap
That convergence, however, is slowed by a significant skills gap. Developers trained in quantum algorithms are scarce, and those who can bridge both quantum and HPC domains are rarer still. Software toolchains are fragmented, vendor-specific, and often immature. Common standards are emerging, such as QIR, OpenQASM, and hybrid quantum-classical frameworks, but these are not yet fully embedded in enterprise or public sector workflows.
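To give a flavour of one of those standards, the snippet below embeds a small OpenQASM 2.0 program – the same Bell circuit as earlier, expressed in a portable interchange format – and loads it with Qiskit's parser. As before, the framework is an assumed choice for illustration.

```python
# A small OpenQASM 2.0 program – one of the interchange formats mentioned
# above – parsed with Qiskit (an assumed tool choice).
from qiskit import QuantumCircuit

BELL_QASM = """
OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];
creg c[2];
h q[0];
cx q[0], q[1];
measure q -> c;
"""

qc = QuantumCircuit.from_qasm_str(BELL_QASM)
print(qc.draw())  # the circuit reconstructed from the portable format
```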
This skills and software gap represents both a challenge and an opportunity. At Red Oak Consulting, we recognise that for quantum to become operationally useful, time and investment are needed – not just in hardware, but in education, tooling, and cross-disciplinary collaboration.
A Long-term Strategic Play
It is tempting to view quantum computing as the next wave in technological disruption, akin to the rise of artificial intelligence (AI). But while AI has moved from research to real-world deployment at unprecedented speed, quantum follows a slower trajectory. Its impact will be narrower in the near term, but potentially more transformative in the long run.
Unlike AI, which benefits from ever-increasing data volumes and iterative training, quantum computing is constrained by the pace of physical breakthroughs. Qubits do not scale as easily as GPUs. Error correction remains a monumental task. As such, quantum must be treated as a strategic horizon, not a short-term race.
For many organisations, the smart approach is to begin now: explore quantum use cases, invest in hybrid infrastructure, and bring HPC and quantum teams into closer alignment. Those who wait until quantum is ‘ready’ may find themselves years behind when it finally is.
The Future is Hybrid
The integration of QPUs within HPC is not about replacing one system with another; it’s about building a new class of computing capability, where different architectures work in concert. The future of performance computing will not be defined by single breakthroughs, but by the careful layering of technologies, each playing to its strengths.
In this landscape, Red Oak Consulting continues to advise clients on preparing for hybrid futures, helping them assess infrastructure needs, workforce readiness, and long-term strategy. Because while quantum computing may not be today’s revolution, it is almost certainly tomorrow’s.
[1] https://isc-hpc.com/