The quantum computing ecosystem is not ready for quantum error correction (QEC) despite recognising it as critical, according to new research.

QEC is a set of techniques for protecting qubit data from errors caused by noise and decoherence, and is a crucial ingredient in building reliable quantum computers.
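
To give a flavour of the idea, below is a minimal sketch in plain Python of the simplest error-correcting scheme, the three-bit repetition code: the logical bit is copied across three physical bits, and a majority vote corrects any single flip. This is a classical analogue for illustration only; real quantum codes cannot copy states (the no-cloning theorem) and rely on entanglement and syndrome measurements instead, but the redundancy-plus-voting principle is the same.

```python
import random

def encode(bit):
    # Repetition code: spread the logical bit across three physical bits.
    return [bit, bit, bit]

def apply_noise(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if at most one flip occurred.
    return int(sum(bits) >= 2)

def logical_error_rate(p, trials=100_000):
    # Estimate how often decoding fails despite the redundancy.
    errors = 0
    for _ in range(trials):
        bit = random.randint(0, 1)
        if decode(apply_noise(encode(bit), p)) != bit:
            errors += 1
    return errors / trials

# With a 5% physical error rate, the logical error rate falls to roughly
# 0.7% (3p^2(1-p) + p^3), illustrating how redundancy suppresses errors.
print(f"physical: 0.05, logical: {logical_error_rate(0.05):.4f}")
```
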

A survey of 300 quantum professionals conducted by Cambridge-based Riverlane found that 95% of respondents rated QEC as “essential to scaling quantum computing”.

Despite keen interest in the field and consistent acknowledgement of its importance, a lack of training and skills is commonly reported as the biggest barrier to its development.

More than two in five respondents (41%) cited insufficient training and knowledge as a pain point in QEC, with other common complaints including a lack of clear guidance on best practices and difficulty securing resources.

“Getting QEC to work on a large scale is an existential challenge for the quantum computing industry,” said a respondent to the survey.

“Making an impact in practical QEC requires an expensive starting investment of both time and money.”

While there have been some efforts to address this, Riverlane suggested they may be insufficient. Around two-thirds of relevant organisations have allocated only “moderate resources” to QEC.

Riverlane said: “QEC is essential for achieving fault tolerance and unlocking scalable quantum computing. Yet today, most progress is happening behind closed doors, with limited access for the broader developer community.

“That’s why we’re expanding our QEC capabilities to include enablement and open-source software development, fostering education and elevating the entire quantum community, helping them adopt and implement QEC techniques.”
