
Today’s quantum platforms need to scale up radically to deliver on quantum computing’s true commercial promise. Innovation to meet this challenge continues to throw focus onto new parts of the quantum stack. Quantum networking technology is closely connected to modularity, and is therefore far more central to this quest than is commonly realized. Forward-thinking quantum players, ecosystem builders and investors are already charting their strategies to maximize their share of this prize.


The best of today’s early quantum systems are already capable of notional ‘beyond classical’ calculations. This becomes possible with a high-fidelity 50-60 qubit module in which environmental noise and crosstalk are sufficiently under control. However, whether such Noisy Intermediate-Scale Quantum (NISQ) modules will be capable of broad commercial utility is much less clear. With 200-300 physical qubits and advanced techniques for error suppression and error mitigation, very interesting science experiments certainly seem possible, but commercial applications with a strong, genuinely quantum advantage may be more difficult to achieve.

The applications of quantum computing where our understanding of the likely benefits is most established (cryptanalysis, materials science, quantum chemistry) often envisage large systems of 1,000 to 25,000+ high-performing qubits, able to sustain quantum calculations without error over a trillion-plus quantum operations: the teraquop regime.


To reach this level of performance, most believe that techniques for quantum error correction (QEC) will also be required to map multiple physical qubits onto higher-performing logical qubits. The catch is that these techniques introduce significant overheads. Though impacted by many factors, the most widely established techniques might require an eye-watering ratio of 1500:1 physical to logical qubits (or more). Popular summaries typically conclude that we will need ‘millions’ of physical qubits to serve large-scale quantum applications.


Proposals exist to reduce the overheads of error correction, but they often depend on significantly more demanding qubit properties: not just much higher raw physical fidelity, but also higher, non-local inter-qubit connectivity during the error correction cycle. Even under relatively aggressive assumptions, a module size of 10,000 physical qubits and an error correction overhead of 20:1, that is still not enough to serve large applications in a single module.
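The aggressive scenario can be checked the same way. A short sketch, again using only the figures quoted above (10,000-qubit modules, a 20:1 overhead), showing why even this optimistic case forces a multi-module design:

```python
# Illustrative only: logical capacity of one module under the aggressive
# assumptions in the text, and the module count needed for large applications.
import math

MODULE_PHYSICAL = 10_000  # physical qubits per module (figure from the text)
QEC_OVERHEAD = 20         # physical qubits per logical qubit (figure from the text)

logical_per_module = MODULE_PHYSICAL // QEC_OVERHEAD
print(f"One module supports {logical_per_module} logical qubits")
# -> One module supports 500 logical qubits

for target_logical in (1_000, 25_000):
    modules = math.ceil(target_logical / logical_per_module)
    print(f"{target_logical:,} logical qubits -> {modules} modules")
# ->  1,000 logical qubits -> 2 modules
# -> 25,000 logical qubits -> 50 modules
```

A single module tops out at 500 logical qubits, well short of the 1,000-25,000+ application range, which is the gap modular scaling has to close.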


GQI believes that for almost all of today’s proposed quantum computing architectures, a modular approach to scaling will also ultimately be required. This will likely entail a distributed rather than a monolithic quantum computing stack. As a bonus, modular scaling brings significant additional advantages: flexibility, maintainability and redundancy.


Most architectures will ultimately need to leverage interconnects, operating at either microwave or optical frequencies. Such strategies are closely connected to the wider field of quantum communications and quantum networking. If sufficiently performant optical photonic interconnects can be achieved, they hold out the promise of unique synergies and flexibilities across these fields.


GQI concludes that it is important for all quantum sector participants to take a more holistic view in appraising quantum computing hardware architectures than has often been the case. 


The fundamental requirement remains:

  • a high fidelity qubit platform


But we also need to maintain a focus on:

  • practical and optimal module sizes 

  • qubit connectivity and interconnect compatibility

  • the QEC codes and pathways to fault tolerance these configurations enable

  • classical control system performance (particularly real time decoding)

  • how the system will ultimately fit within a hybrid HPC data center environment.


Efficiently orchestrating such large, modular quantum hardware systems will itself be a key challenge spanning multiple disciplines: our understanding of the physics, physical engineering and software engineering. Academic work has often addressed these issues under the labels ‘distributed quantum computing’ or ‘entanglement distribution’. Commercial hardware players refer to systems engineering and modular scaling. Would-be mid-stack software/firmware providers talk of operating systems and software stack abstraction.


GQI feels these converging discussions have sometimes obscured how central quantum networking technology, at least at short range, is to the delivery of mainstream quantum computing. GQI believes that many early quantum computing players have not explained this part of their roadmaps in sufficient detail. Major challenges often lurk in these gaps.


GQI believes that this challenge also brings opportunity. New startups are enriching the ‘network layers’ of the quantum computing stack with innovation. Opportunities exist to capture important heights in the quantum computing value chain, and new business models are available for investors to back.

Outlook Report | Scalable Quantum Hardware '24

  • The GQI Outlook series is the preeminent, authoritative annual overview of the quantum tech space.

    Spanning quantum applications, the Outlooks analyze computing hardware and software, algorithms and other areas, with an in-depth look at the technologies, products and vendors, and the current state of the science driving them.

    Anyone interested in quantum tech will not find a more detailed or valuable analysis of the state of the field than the GQI Outlooks.
