Assessing Quantum Advantage for Gaussian Process Regression

Gaussian Process Regression is a well-known machine learning technique for which several quantum algorithms have been proposed. We show here that, in a wide range of scenarios, these algorithms offer no exponential speedup. We achieve this by rigorously proving that the condition number of a kernel matrix scales at least linearly with the matrix size under general assumptions on the data and kernel. We additionally prove that the sparsity and Frobenius norm of a kernel matrix scale linearly under similar assumptions. The resulting implications for the quantum algorithms' runtimes hold independently of the complexity of loading classical data onto a quantum computer, and they also apply to dequantised algorithms. We supplement our theoretical analysis with numerical verification for kernels popular in machine learning.
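As an illustrative numerical sketch (not the paper's own experiments), the snippet below tracks how the condition number and Frobenius norm of a kernel matrix grow with the number of data points, using a squared-exponential (RBF) kernel plus the usual GP noise term. The kernel choice, lengthscale, noise level, and data distribution are assumptions made purely for illustration.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix for 1-D inputs X."""
    d = X[:, None] - X[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
sigma2 = 1e-3  # illustrative noise variance, as in standard GP regression

for n in (100, 200, 400, 800):
    X = rng.uniform(0.0, 10.0, size=n)          # data from a fixed bounded domain
    K = rbf_kernel(X) + sigma2 * np.eye(n)      # noisy kernel matrix K + sigma^2 I
    cond = np.linalg.cond(K)
    fro = np.linalg.norm(K, "fro")
    print(f"n={n:4d}  cond(K)={cond:.3e}  ||K||_F={fro:.3e}  ||K||_F/n={fro/n:.3e}")
```

Under these assumptions, the condition number grows roughly in proportion to n and the Frobenius norm divided by n approaches a constant, consistent with the linear-scaling behaviour stated in the abstract.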
@article{lowe2025_2505.22502,
  title   = {Assessing Quantum Advantage for Gaussian Process Regression},
  author  = {Dominic Lowe and M. S. Kim and Roberto Bondesan},
  journal = {arXiv preprint arXiv:2505.22502},
  year    = {2025}
}