Key findings
Today’s quantum platforms need to radically scale up to deliver on quantum computing’s true commercial promise. Innovation to meet this challenge continues to throw focus onto new parts of the quantum stack. Quantum networking technology is much more central to this quest than is commonly realized. Forward-thinking quantum players, ecosystem builders and investors are already charting their strategies to maximize their share of this prize.
Summary
The best of today’s early quantum systems are already capable of notional ‘beyond classical’ calculations. This starts to be possible with a high-fidelity 50-60 qubit module where environmental noise and crosstalk are sufficiently under control. However, whether such Noisy Intermediate-Scale Quantum (NISQ) modules will be capable of broad commercial utility is much less clear. With 200-300 physical qubits and advanced techniques for error suppression and error mitigation, very interesting science experiments certainly seem possible, but commercial applications with a strong quantum advantage may be more difficult to achieve.
The applications of quantum computing where our understanding of the likely benefits is most established (cryptanalysis, materials science, quantum chemistry) often require large systems of 1,000Q - 10,000Q+ high-performing qubits, able to sustain quantum calculations without error over a trillion-plus quantum operations - the teraquop regime.
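A back-of-envelope sketch of what the teraquop regime implies. If a calculation must survive roughly a trillion quantum operations without a logical fault, the error rate per logical operation must be of order one in a trillion or better; the figures below are illustrative assumptions, not numbers from the report.

```python
# Sketch: required logical error rate for the teraquop regime.
# Assumption: a run of ~1e12 operations should, on average, see
# well under one logical fault.
target_ops = 1e12                       # a trillion-plus operations
required_logical_error_rate = 1 / target_ops

print(f"Logical error rate per op must be <= {required_logical_error_rate:.0e}")
# i.e. of order 1e-12 - far beyond today's raw physical error rates
# of roughly 1e-3 to 1e-4, which is why error correction is needed.
```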
To reach this level of performance most believe that techniques for quantum error correction will also be required, mapping multiple physical qubits onto higher-performing logical qubits. The catch is that these techniques introduce significant overheads. Though impacted by many factors, the most widely established techniques might require an eye-watering physical-to-logical ratio of 1,500:1 or more. Popular summaries typically conclude that we will need ‘millions’ of physical qubits to serve large-scale quantum applications.
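The arithmetic behind the ‘millions’ headline is simple to reproduce. The 1,500:1 ratio and the 1,000-10,000+ logical-qubit application sizes are taken from the text; everything else here is just multiplication.

```python
# Sketch: total physical qubits implied by the overheads cited above.
def physical_qubits_needed(logical_qubits: int, overhead_ratio: int) -> int:
    """Physical qubits required at a given physical-to-logical ratio."""
    return logical_qubits * overhead_ratio

# Application sizes cited in the text: 1,000 - 10,000+ logical qubits,
# at the widely quoted 1,500:1 overhead.
for logical in (1_000, 10_000):
    total = physical_qubits_needed(logical, 1_500)
    print(f"{logical:>6,} logical qubits x 1500:1 -> {total:>10,} physical qubits")
# 1,000 logical -> 1,500,000 physical; 10,000 logical -> 15,000,000 physical:
# hence the 'millions of physical qubits' conclusion.
```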
Proposals exist to reduce the overheads of error correction, but they often depend on significantly more demanding qubit properties: not just much higher raw physical fidelity, but in particular higher, non-local inter-qubit connectivity during the error correction cycle. Even under relatively aggressive assumptions - a module size of 10,000 physical qubits and an error correction overhead of 20:1 - a single module is still not enough to serve large applications.
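Checking the module-size arithmetic in the text: even with an aggressive 20:1 overhead, a 10,000-physical-qubit module yields only 500 logical qubits, short of the 1,000-10,000+ logical qubits that large applications are expected to need. The numbers are the report's own; the ceiling-division at the end simply counts modules.

```python
# Sketch: logical qubits per module under the aggressive assumptions above.
module_physical_qubits = 10_000   # assumed module size (from the text)
overhead_ratio = 20               # aggressive physical-to-logical ratio

logical_per_module = module_physical_qubits // overhead_ratio
print(f"Logical qubits per module: {logical_per_module}")   # 500

# Even the low end of the cited application range needs more than one module.
application_logical_qubits = 1_000
modules_needed = -(-application_logical_qubits // logical_per_module)  # ceiling division
print(f"Modules needed at the low end: {modules_needed}")   # 2 - hence modular scaling
```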
Many hardware strategies are possible to enhance scalability: reduced physical qubit error rates, enhanced module sizes, higher-rate error correcting codes. However, GQI believes that for all of today’s proposed quantum computing architectures a modular approach to scaling will also ultimately be required. For most architectures that will ultimately mean a desire to leverage interconnects, operating at either microwave or optical frequencies. Such strategies come with a close connection to the wider field of quantum communications and quantum networking. If sufficiently performant optical photonic interconnects can be achieved, they hold out the promise of unique synergies and flexibilities across these fields. As a bonus, modular scaling comes with significant additional advantages: flexibility, maintainability and redundancy.
GQI concludes that it is important for all quantum sector participants to take a holistic view in appraising quantum computing architectures. It is a mistake to view the performance of a qubit system in isolation from the practical module sizes it can support; to neglect the interconnect technologies with which it is compatible; to fail to account for the quantum error correcting codes and pathways to fault tolerance the configuration can enable; to miss the complexity of the high performance classical control system this may require; to fail to anticipate how the system will ultimately be embedded within what most expect to be a complex, hybrid HPC data center environment.
Efficiently orchestrating a large modular quantum hardware system will itself be a key challenge across multiple disciplines: our understanding of the physics, physical engineering and software engineering. Academic work has often addressed these issues under the label ‘distributed quantum computing’ or as an aspect of ‘entanglement distribution’. Commercial hardware players refer to systems engineering and modular scaling. Would-be midstack software/firmware providers talk about operating systems and software stack abstraction. GQI feels these converging discussions have sometimes obscured the centrality of quantum networking technology, at least at short ranges, to the delivery of mainstream quantum computing. GQI believes that many early quantum computing players have often not explained this part of the roadmap in sufficient detail. Major challenges often lurk in these gaps.
GQI believes that challenge also brings opportunity. New startups are enriching the ‘network layers’ of the quantum computing stack with innovation. Opportunities exist to capture important heights in the quantum computing value chain. New business models are available for investors to back.