
Bootstrapping the Future: From Resource Estimation to Quantum Advantage


Quantum Algorithms That Work

A central problem in quantum computing is the development of domain-specific algorithms that enable real scientific and economic contributions. Quantum algorithms in numerous domains such as finance, chemistry, physics, and biology are being developed with the intention of achieving quantum advantage, i.e., deriving deeper insights than established algorithms running on classical machines. But these algorithms are typically evaluated through simulation and/or testing on small devices (e.g. < 100 qubits), calling into question whether their claimed benefits will hold as quantum systems scale. System-level challenges have traditionally complicated the economic potential of quantum computing, but significant advances in system fault tolerance, qubit counts, and ecosystem standardization show tremendous promise. Developments in this realm have received the lion’s share of media attention and are likely driving the investing surge into quantum technology. Nevertheless, following the influx of capital and the drive to deploy intermediate quantum systems, the technology’s ultimate value will be determined by its real-world utility, enabled by domain-specific algorithms that scale alongside the systems that host them.

The Prototype Paradox

Effective domain-specific algorithms face a significant challenge: researchers need to design for a system architecture that has yet to mature, in a manner that outperforms all existing classical solutions. With the anticipated deployment of quantum cloud computing, on-device testing to validate performance should enable more experimentation and faster validation. That being said, near-term quantum computers, regardless of physical modality, operate within a paradigm of error mitigation (as opposed to elimination), and rely on extensive corrective measures such as noise suppression and cancellation techniques, circuit knitting, and variational algorithms. This ensemble of approaches entails a hybrid quantum-classical computing system and imposes a significant amount of computational overhead. Performance on these near-term systems therefore bears little resemblance to that of future fault-tolerant systems, which are expected to have orders of magnitude greater capacity. Quantum simulation, as an alternative to on-device testing, is similarly bottlenecked by limited near-term quantum and classical computing capacity. Algorithms optimized for near-term noisy devices and their computationally efficient methods may be ill-suited to large, fault-tolerant systems, and emerging algorithms that appear untenable on current hardware may be ignored in favor of classical solutions. These limitations may lead to a scenario in which fault-tolerant hardware is developed, but a lack of corresponding advancement in quantum software delays real-world achievements and punishes investors.

Resource Estimation

To address this issue, Resource Estimation (RE) has been proposed as a complementary method to on-device testing and simulation that can indicate the physical and logical requirements of quantum algorithms, particularly those that exceed the capacity of current hardware. While RE can characterize requirements for near-term noisy quantum computers, it can more importantly provide estimates for future fault-tolerant systems. Notable resource estimation tools include the Azure Quantum Resource Estimator, Google Quantum Algorithm Translator, Zapata BenchQ, and MIT pyLIQTR. Roughly speaking, these tools work in the following manner:

1. Use quantum instruction set architectures to translate high-level code into a logical representation.
2. Transform the logical representation into a low-level physical representation.
3. Estimate runtime under specific parameters, such as the number of T gates and corresponding T factories (a bottleneck in quantum architectures), and the number of logical qubits and corresponding physical qubits.

These parameters describe the circuit in terms of its depth (gates/factories) and width (qubits), as in circuit complexity theory, with the added abstraction of logical to physical qubits to account for the cost of error correction. These estimates can provide stakeholders with a better understanding of how quantum computing will actually perform by 2030 and beyond.
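As a toy illustration of the logical-to-physical accounting described above (not the method of any specific tool named here), the sketch below applies widely cited surface-code heuristics: a logical qubit at code distance d costs roughly 2d² physical qubits, and the logical error rate per operation scales as roughly 0.1·(p/p_th)^((d+1)/2), where p is the physical error rate and p_th the threshold. All constants and parameter values are illustrative assumptions.

```python
def required_distance(p_phys, p_target, logical_qubits, t_count, p_th=1e-2):
    """Smallest odd surface-code distance d such that the whole computation
    fits within the failure-probability budget p_target.

    Uses the heuristic p_logical ~ 0.1 * (p_phys / p_th) ** ((d + 1) / 2)
    and treats logical_qubits * t_count as a proxy for the total number of
    error-prone logical operations.
    """
    ops = logical_qubits * t_count
    d = 3
    while True:
        p_logical = 0.1 * (p_phys / p_th) ** ((d + 1) / 2)
        if ops * p_logical < p_target:
            return d
        d += 2  # surface-code distances are odd

def estimate(p_phys, p_target, logical_qubits, t_count):
    """Translate a logical circuit's width (qubits) and depth proxy (T count)
    into a rough physical-qubit footprint."""
    d = required_distance(p_phys, p_target, logical_qubits, t_count)
    phys_per_logical = 2 * d * d  # rotated surface-code heuristic
    return {
        "distance": d,
        "physical_qubits": logical_qubits * phys_per_logical,
    }

# Illustrative run: 100 logical qubits, a million T gates, physical error
# rate 1e-3, and a 50% overall failure budget.
result = estimate(1e-3, 0.5, 100, 1_000_000)
print(result)
```

Even this crude sketch reproduces the headline point of resource estimation: a modest logical circuit can demand tens of thousands of physical qubits once error correction is priced in, which is exactly the gap between near-term devices and the fault-tolerant systems these tools try to characterize.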

Limitations

While RE is certainly useful, the methodology currently has two aspects that limit its generalizability and predictive power:
