hardware algorithms machine_learning

A hardware efficient quantum residual neural network without post-selection

Curator's Take

This research tackles one of the most pressing challenges in quantum machine learning by developing a quantum neural network that achieves comparable performance to existing models while using 10 times fewer quantum gates. The breakthrough lies in implementing residual connections without post-selection, a technique that eliminates the need to discard failed measurement outcomes that typically plague quantum algorithms and make them inefficient on real hardware. By demonstrating strong performance on standard image classification benchmarks while avoiding barren plateaus during training, this work provides a practical pathway for deploying quantum machine learning on the noisy, resource-constrained quantum processors available today. The combination of hardware efficiency, trainability, and adversarial robustness makes this a significant step toward quantum machine learning models that could actually run effectively on near-term quantum devices.

— Mark Eatherly

Summary

We propose a hardware-efficient quantum residual neural network that implements residual connections through a deterministic linear combination of identity and variational unitaries, enabling fully differentiable training. In contrast to previous implementations of residual connections, our architecture avoids post-selection while preserving residual learning. Furthermore, we establish the trainability of our model, mitigating barren plateaus, which are considered a major limitation of variational quantum learning models. To demonstrate the model in practice, we report its application to image classification tasks, training it on the MNIST, CIFAR, and SARFish datasets and achieving accuracies of 99% and 80% for binary and multi-class classification, respectively. These accuracies are comparable to those previously achieved with standard variational models; however, our model requires 10x fewer gates, making it better suited for resource-constrained near-term quantum processors. Beyond high accuracy, the proposed architecture also demonstrates adversarial robustness, another desirable property for quantum machine learning models. Overall, our architecture offers a new pathway for developing accurate, robust, trainable, and hardware-efficient quantum machine learning models.
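The core idea of a post-selection-free residual connection can be sketched classically: apply the identity branch and a variational unitary branch to the same state, sum the two, and renormalize. The sketch below is a minimal NumPy illustration under stated assumptions; the function names, the random-Hermitian construction of the variational unitary, and the two-qubit setup are all assumptions for illustration, not the paper's exact circuit or gate decomposition.

```python
import numpy as np

# Illustrative sketch only (assumptions, not the paper's method):
# a variational unitary U(theta) is generated from a fixed random
# Hermitian H, and the residual map applies (I + U(theta))|psi>,
# then renormalizes -- emulating a residual connection without a
# post-selection (measure-and-discard) step.

def variational_unitary(theta, n_qubits=2, seed=0):
    """U(theta) = exp(-i * theta * H) for a fixed random Hermitian H."""
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    H = (A + A.conj().T) / 2  # Hermitian generator
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * theta * evals)) @ evecs.conj().T

def residual_block(psi, theta):
    """Residual map: (identity branch + variational branch), renormalized."""
    n_qubits = int(np.log2(len(psi)))
    U = variational_unitary(theta, n_qubits=n_qubits)
    out = psi + U @ psi  # I|psi> + U(theta)|psi>
    return out / np.linalg.norm(out)

# Example: apply one residual block to the |00> state.
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0
out = residual_block(psi, theta=0.3)
```

Note that at theta = 0 the variational branch reduces to the identity, so the block acts trivially (output equals input after renormalization), which is the behavior a residual connection is designed to recover.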