Curator's Take
This research reveals a previously underappreciated threat to quantum networks that could explain why some experimental systems fall short of theoretical predictions. The discovery of "hyperloss" - where seemingly minor spatial mode mismatches can completely destroy quantum advantages by converting squeezed light into thermal noise - challenges fundamental assumptions in quantum network design. What makes this particularly significant is that the effect is controllable: the researchers demonstrated they could not only eliminate the hyperloss but actually turn mode mismatches into an advantage, reducing effective loss from 15% to just 2.8%. This finding could be crucial for scaling up quantum sensing networks, gravitational wave detectors, and distributed quantum computing systems where maintaining quantum correlations across multiple nodes remains a major challenge.
— Mark Eatherly
Summary
Quantum-correlated networks distribute quantum resources such as squeezed and entangled states. These states are central to modern quantum technology, including photonic quantum computing, quantum communications, non-destructive biological sensing and gravitational-wave detection. Even for squeezed states of light - the most robust quantum-correlated resource - loss-induced decoherence remains the dominant obstacle to strong quantum advantage in large-scale interferometric and networked quantum systems. A common design assumption in these applications is to treat mismatches between spatial modes as a small, incoherent loss. Here we show that this picture can fail: coherent mixing with higher-order spatial modes can produce an apparent loss exceeding 100% relative to the initial squeezing, a regime we term hyperloss. We experimentally demonstrate hyperloss in a minimal two-node quantum network: with only 8% mode mismatch, a 5.8 dB squeezed state is converted into an effectively thermal state with no quadrature squeezing, eliminating the quantum advantage. Because the effect is coherent, it is controllable: lost correlations can be recovered by tuning differential spatial-mode phases (e.g., the Gouy or propagation phase). We demonstrate this recovery experimentally, not only eliminating the hyperloss but also significantly suppressing the mode-mismatch loss, with 15% geometric mismatch acting like only ~2.8% effective loss. Hyperloss is a design-limiting mechanism for all quantum networks that use squeezed light, from photonic quantum processors to large-scale interferometers and distributed quantum-sensing networks. Our results provide a practical route to avoiding hyperloss and turn mode mismatch into an explicit, phase-aware design parameter for future quantum technologies.
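To see why the reported numbers are surprising, here is a minimal numerical sketch of the *standard* incoherent-loss model for a squeezed quadrature (vacuum variance normalized to 1). This is not the paper's analysis; the function names are illustrative, and the model deliberately omits the coherent mode-mixing that produces hyperloss. Under this conventional picture, 8% mode mismatch treated as 8% loss would leave a 5.8 dB state clearly squeezed - yet the experiment observes a thermal state, i.e. an effective loss exceeding 100% of the squeezing.

```python
import math

def variance_after_loss(squeezing_db, loss):
    """Squeezed-quadrature variance after an incoherent loss channel.

    Standard single-mode loss model: V_out = (1 - loss) * V_in + loss,
    with vacuum noise normalized to 1 and V_in = 10^(-S_dB/10).
    """
    v_in = 10 ** (-squeezing_db / 10)
    return (1 - loss) * v_in + loss

def squeezing_db(variance):
    """Convert a sub-vacuum variance back to dB of squeezing."""
    return -10 * math.log10(variance)

# Conventional picture: 8% mismatch modeled as 8% incoherent loss.
v = variance_after_loss(5.8, 0.08)
print(f"incoherent-loss prediction: V = {v:.3f} "
      f"({squeezing_db(v):.1f} dB of squeezing survives)")
# The experiment instead observes V >= 1 (thermal), which no loss
# value in [0, 1] reproduces from this model alone: hyperloss.

# Phase-tuned regime reported in the paper: 15% geometric mismatch
# behaving like only ~2.8% effective loss.
v_tuned = variance_after_loss(5.8, 0.028)
print(f"phase-tuned effective loss:  V = {v_tuned:.3f} "
      f"({squeezing_db(v_tuned):.1f} dB of squeezing survives)")
```

The point of the sketch is the gap between model and observation: the incoherent-loss formula predicts most of the 5.8 dB survives an 8% mismatch, so the observed thermal output can only come from a coherent mechanism outside this model.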