Abstract
Combining quantum sensing with quantum computing can lead to quantum computational sensing (QCS) protocols that extract task-specific information from physical signals more efficiently than is otherwise possible. In this paper, we present, in theory and numerical simulations, the application of two quantum algorithms—quantum signal processing and quantum neural networks—to various binary and multiclass machine-learning classification tasks in sensing. Here, sensing operations are interleaved with computing operations, giving rise to nonlinear functions of the sensed signals. Our approach to optimizing QCS protocols takes quantum sampling noise into account and allows us to engineer protocols that can yield accurate results with as few as a single measurement shot. In all cases, we show a regime of operation in which a quantum computational sensor achieves higher accuracy than a conventional quantum sensor for a given budget of sensing time, with a simulated accuracy advantage of >20 percentage points for some tasks. We also present protocols for performing nonlinear tasks using Hamiltonian-engineered bosonic systems and quantum signal processing with hybrid qubit-bosonic systems, and empirically show an advantage when the received signal has a limited mean photon number.
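The interleaving of sensing and computing steps mentioned above can be illustrated with the standard single-qubit quantum signal processing sequence, in which signal-dependent rotations alternate with programmable processing rotations; the resulting matrix element is a polynomial (hence nonlinear) function of the sensed parameter. The sketch below is a minimal illustration of this generic QSP structure, not the paper's actual protocol, and all function names and phase choices are hypothetical.

```python
import numpy as np

# Pauli matrices used to build the two kinds of rotations.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def signal_rotation(theta):
    # "Sensing" step: exp(i*theta*X), whose angle encodes the physical signal.
    return np.cos(theta) * np.eye(2) + 1j * np.sin(theta) * X

def processing_rotation(phi):
    # "Computing" step: programmable phase rotation exp(i*phi*Z)
    # applied between sensing steps.
    return np.cos(phi) * np.eye(2) + 1j * np.sin(phi) * Z

def qsp_response(theta, phis):
    # Interleave d sensing rotations with d+1 processing rotations
    # and return the <0|U|0> matrix element of the overall unitary.
    U = processing_rotation(phis[0])
    for phi in phis[1:]:
        U = processing_rotation(phi) @ signal_rotation(theta) @ U
    return U[0, 0]

# With trivial processing phases (all zero), three sensing steps give
# <0|U|0> = cos(3*theta), the Chebyshev polynomial T_3 of cos(theta):
# already a nonlinear function of the sensed signal. Nontrivial phases
# realize other polynomials, which is what makes the response tunable.
theta = 0.4
amp = qsp_response(theta, [0.0, 0.0, 0.0, 0.0])
print(np.isclose(amp.real, np.cos(3 * theta)))  # True
```

Optimizing the processing phases shapes this polynomial response, which is the sense in which interleaved computation lets a sensor compute task-specific nonlinear functions of its input.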