
We do now have real quantum applications, though these are currently modest compared to our expectations. As early adopter interest grows, how are the prospects for commercially relevant algorithms shaping up? We examine powerful challenges, and also powerful new opportunities.


The strategic algorithms playing field


Quantum computers are set to be much slower than conventional computers when measured on clock speed alone. The real advantage they offer comes from the unique quantum algorithms they promise to run. Much hype has surrounded the claims and expectations for what such algorithms can achieve.


Some quantum algorithms offer remarkable (exponential) speedups, others more modest (quadratic) speedups. Some are heuristics whose benefits must be evaluated by trial and error.


It is not possible to divorce algorithms from the specs of the hardware on which they are intended to run. Academics have been developing quantum algorithms for 30 years. Early quantum hardware varied from non-existent to very poor. However, it was known that beyond a certain fidelity threshold gate-model architectures could in principle use quantum error correction to turn noisy physical qubits into better logical qubits.  It was therefore natural for algorithm theorists to focus on the abstraction of fault tolerant quantum computation (FTQC). 


The most famous gate-model quantum primitives all date from this era: the quantum Fourier transform (QFT), used, for example, in Shor’s algorithm for factoring; quantum phase estimation (QPE), used, for example, in the HHL algorithm for linear algebra and in quantum simulation; and amplitude amplification (AA), used, for example, in Grover’s algorithm for unstructured search and in discrete-time quantum walks for Monte Carlo.
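As a concrete illustration of the first of these primitives: on amplitudes, the QFT acts as a discrete Fourier transform of dimension N = 2^n. The sketch below builds that unitary classically with NumPy; it shows what the transform computes, not how a quantum circuit implements it (a quantum computer applies it with only O(n²) gates, while the matrix itself is exponentially large).

```python
import numpy as np

def qft_matrix(n_qubits):
    """Build the unitary matrix of the quantum Fourier transform on n qubits.

    On amplitudes the QFT is the discrete Fourier transform of dimension
    N = 2**n, normalised so the matrix is unitary.
    """
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)

# The QFT is unitary: F F† = I.
assert np.allclose(F @ F.conj().T, np.eye(8))

# Applied to |0>, the QFT produces the uniform superposition.
state = np.zeros(8)
state[0] = 1.0
assert np.allclose(F @ state, np.full(8, 1 / np.sqrt(8)))
```

Shor’s algorithm and QPE both rely on exactly this transform to read out periodicity hidden in the phases of a quantum state.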


QFT and QPE typically offer an exponential advantage, though sometimes with important caveats. AA typically offers a quadratic advantage, again typically with caveats. These primitives typically require the high-depth quantum circuits that FTQC implicitly provides.
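The quadratic advantage of AA can be seen in a small statevector simulation of Grover search. Classically, finding one marked item among N takes ~N/2 queries on average; Grover needs only ~(π/4)·√N oracle calls. This is a toy classical simulation of the amplitude dynamics, not a circuit implementation.

```python
import numpy as np

# Grover search over N = 16 items: the marked item is found in
# ~(pi/4)*sqrt(N) oracle calls, versus ~N/2 classical queries on average.
N, marked = 16, 11
amps = np.full(N, 1 / np.sqrt(N))            # uniform superposition

iterations = int(np.pi / 4 * np.sqrt(N))     # 3 iterations for N = 16
for _ in range(iterations):
    amps[marked] *= -1                       # oracle: flip the marked sign
    amps = 2 * amps.mean() - amps            # diffusion: invert about the mean

p_success = amps[marked] ** 2                # ~0.96 after 3 iterations
```

Three oracle calls instead of an expected eight is a modest saving at N = 16, but the √N scaling compounds as the search space grows, which is why the advantage only pays off at scale and with caveats on circuit depth.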


More recently it has been shown that all of these primitives can be understood as arising from one unified framework, a grand unified quantum primitive – the quantum singular value transformation (QSVT).
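What QSVT implements can be stated compactly: apply a polynomial to the singular values of a (block-encoded) matrix. The sketch below performs that transformation classically with an SVD, purely to illustrate the map; it says nothing about the quantum circuit, which achieves it with a number of operations scaling with the polynomial degree.

```python
import numpy as np

def singular_value_transform(A, poly):
    """Apply a polynomial to the singular values of A: U diag(p(s)) V†.

    Classical illustration of the map that QSVT implements on a
    block-encoded matrix.
    """
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(poly(s)) @ Vh

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# p(x) = x**3 keeps singular values non-negative and order-preserving,
# so the transformed matrix has singular values exactly p(sigma).
B = singular_value_transform(A, lambda s: s ** 3)
s_A = np.linalg.svd(A, compute_uv=False)
s_B = np.linalg.svd(B, compute_uv=False)
assert np.allclose(s_B, s_A ** 3)
```

Choosing different polynomials recovers the earlier primitives: polynomial approximations to 1/x give HHL-style linear algebra, sign-like polynomials give amplitude amplification, and so on.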


When physical devices improved to the point that they were able to demonstrate they had some ‘beyond classical’ capabilities, a new focus naturally developed: what might we be able to do directly with these noisy intermediate scale quantum (NISQ) devices?


For gate-model approaches, limited fidelities mean that high-depth quantum circuits are not feasible. Variational quantum algorithms (VQAs) have emerged as the most promising body of techniques aimed at NISQ devices, combining low-depth quantum circuits with classical processing in an iterative loop.
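A minimal sketch of that iterative loop, stripped to one qubit: a one-parameter ansatz |ψ(θ)⟩ = RY(θ)|0⟩, a cost E(θ) = ⟨ψ|Z|ψ⟩ = cos(θ) that a quantum device would evaluate, and a classical optimiser updating the parameter. The ansatz, observable, and learning rate here are illustrative choices, not any particular vendor’s recipe.

```python
import numpy as np

# One-qubit VQA loop: ansatz |psi(theta)> = RY(theta)|0>,
# cost E(theta) = <psi|Z|psi> = cos(theta), minimised classically.
def energy(theta):
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # RY(theta)|0>
    z_diag = np.array([1.0, -1.0])                          # Pauli-Z diagonal
    return float(np.sum(z_diag * psi ** 2))

theta, lr = 0.1, 0.4
for _ in range(200):
    # Parameter-shift rule: an exact gradient from two extra evaluations
    # of the same low-depth circuit.
    grad = (energy(theta + np.pi / 2) - energy(theta - np.pi / 2)) / 2
    theta -= lr * grad

# The loop converges to theta = pi, where E = -1: the ground state |1>.
```

Real VQAs replace the single rotation with a layered parameterised circuit and the single Pauli-Z with a problem Hamiltonian, but the structure — quantum evaluation inside a classical optimisation loop — is exactly this.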


Quantum annealing (QA) can be seen as a NISQ era implementation of adiabatic quantum computing, an alternative to gate-model computation.
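Quantum annealers accept problems in QUBO form: minimise x^T Q x over binary x. The sketch below attacks a tiny random instance with classical simulated annealing, the thermal heuristic that QA generalises with quantum fluctuations in place of thermal ones; the instance, schedule, and temperatures are illustrative assumptions.

```python
import itertools
import numpy as np

# A tiny QUBO: minimise x^T Q x over binary x, the native problem
# format of quantum annealers, here solved by classical simulated
# annealing for illustration.
rng = np.random.default_rng(1)
n = 4
Q = rng.standard_normal((n, n))
Q = (Q + Q.T) / 2                                # symmetrise

def cost(x):
    return float(x @ Q @ x)

x = rng.integers(0, 2, n).astype(float)
best, best_cost = x.copy(), cost(x)
for step in range(2000):
    T = max(0.01, 2.0 * (1 - step / 2000))       # linear cooling schedule
    i = rng.integers(n)
    y = x.copy()
    y[i] = 1 - y[i]                              # propose a single bit flip
    delta = cost(y) - cost(x)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        x = y
    if cost(x) < best_cost:
        best, best_cost = x.copy(), cost(x)

# Brute force over all 2**n assignments for comparison. Being a
# heuristic, the annealer carries no guarantee, but on an instance this
# small it typically recovers the true optimum.
bf = min(cost(np.array(b, dtype=float))
         for b in itertools.product([0, 1], repeat=n))
```

The same QUBO encoding covers many optimisation use cases (scheduling, portfolio selection, routing), which is why annealing attracted early commercial interest despite the lack of provable speedups.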


These techniques are heuristics. In contrast to the full-scale FTQC techniques described above, we have no guarantee of their accuracy, their performance, or the speedup they may offer.


As developer roadmaps now increasingly project forward to the introduction of fault tolerant techniques, it is also increasingly clear that early FTQC devices will still not offer the specs to run all full-scale FTQC algorithms. Algorithms suited to this new middle ground are an increasingly important focus.


A low-depth algorithm breakthrough could still yield broad NISQ quantum advantage. However, without further progress on algorithms, the alternative could be a hard quantum winter. Making quantum software startups’ financial plans robust to both scenarios is a challenge.


Cryptographic applications may provide an early opportunity in quantum-enabled networks. Some see helping to join up a future entanglement-based Quantum Internet as the biggest opportunity of all.


For some quantum algorithm players, a presence in FTQC may be a stepping stone to wider network-enabled applications.


Key challenges


Today’s quantum hardware does not yet close the gap with the specs that quantum algorithm developers want to target. Even with currently anticipated hardware improvements, progress on algorithms is still needed to bring them into alignment with real-world use cases on the timescales the sector, and its investors, commonly assume. Such work is being very actively pursued.


Looking across recent progress, GQI believes the key questions fall along important time horizons:

  • 2-5 years – what is realistically achievable with NISQ era hardware?
    Optimism is reasonable for specific use cases; but broad quantum advantage looks challenging.
  • 5-10 years – what might be addressed with early FTQC devices?
    Proven applications are more modest than we might hope; but this is a relatively new segment. We can expect further innovation. Quantum resource estimation is key.
  • In the long term – is the hype around quantum computing justified?
    GQI continues to believe that the long-term prognosis is bright.
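Quantum resource estimation, flagged above as key for the early FTQC horizon, translates an algorithm's logical requirements into code distances and physical qubit counts. The sketch below uses standard surface-code scaling; the constants (A = 0.1, threshold p_th = 1e-2, 2d² physical qubits per logical qubit) are common ballpark assumptions, not figures from any specific paper or device.

```python
# Toy fault-tolerance resource estimate using standard surface-code
# scaling: logical error rate per operation ~ A * (p/p_th)^((d+1)/2),
# and roughly 2*d^2 physical qubits per logical qubit. All constants
# here are ballpark assumptions for illustration only.
def estimate(logical_qubits, logical_ops, p_phys=1e-3, p_th=1e-2, A=0.1):
    # Spread a total failure budget of ~1 across every logical operation.
    target = 1.0 / (logical_qubits * logical_ops)
    d = 3                                        # surface-code distance (odd)
    while A * (p_phys / p_th) ** ((d + 1) / 2) > target:
        d += 2
    physical_qubits = logical_qubits * 2 * d * d
    return d, physical_qubits

# Example: 100 logical qubits running 10^9 logical operations needs a
# code distance of around twenty and tens of thousands of physical qubits.
d, physical = estimate(100, 10 ** 9)
```

Estimates like this make the middle-ground problem concrete: the physical qubit count is driven as much by the algorithm's operation count as by its logical qubit count, so shallower algorithms directly reduce the hardware needed.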

Outlook Report | Quantum Algorithms '23

  • The GQI Outlook series is the predominant, authoritative annual overview of the quantum tech space.

    Provided across quantum applications, the Outlooks analyze computing hardware and software, algorithms and other areas, with an in-depth look at the technologies, products, vendors - and the current state of the science driving them.

    Anyone interested in quantum tech will not find a more detailed or valuable analysis of the state of the field than the GQI Outlooks.
