
Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization

Abstract

Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians $\mathcal{N}(\mathbf{0}, \mathbf{K})$ or whitening a vector $\mathbf{b}$ against covariance matrix $\mathbf{K}$. While existing methods typically require $O(N^3)$ computation, we introduce a highly efficient quadratic-time algorithm for computing $\mathbf{K}^{1/2}\mathbf{b}$, $\mathbf{K}^{-1/2}\mathbf{b}$, and their derivatives through matrix-vector multiplications (MVMs). Our method combines Krylov subspace methods with a rational approximation and typically achieves 4 decimal places of accuracy with fewer than 100 MVMs. Moreover, the backward pass requires little additional computation. We demonstrate our method's applicability on matrices as large as $50{,}000 \times 50{,}000$ - well beyond traditional methods - with little approximation error. Applying this increased scalability to variational Gaussian processes, Bayesian optimization, and Gibbs sampling results in more powerful models with higher accuracy.
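The abstract does not spell out the quadrature rule, so the following is only a rough sketch of the general idea: a rational approximation of $\mathbf{K}^{-1/2}\mathbf{b}$ whose shifted solves are carried out by a Krylov method, so that $\mathbf{K}$ is touched only through MVMs. It discretizes the identity $\mathbf{K}^{-1/2} = \tfrac{2}{\pi}\int_0^\infty (\mathbf{K} + t^2 \mathbf{I})^{-1}\,dt$ with plain Gauss-Legendre quadrature rather than the paper's quadrature scheme, and uses CG in place of the paper's solver; the function name `inv_sqrt_mv` and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import sqrtm
from scipy.sparse.linalg import LinearOperator, cg

def inv_sqrt_mv(matvec, b, num_quad=60):
    """Approximate K^{-1/2} b using only MVMs with K (illustrative sketch).

    Discretizes K^{-1/2} = (2/pi) * integral_0^inf (K + t^2 I)^{-1} dt
    with Gauss-Legendre quadrature after substituting t = tan(theta),
    and solves each shifted system (K + t^2 I) x = b with CG.
    """
    n = b.shape[0]
    x, w = np.polynomial.legendre.leggauss(num_quad)
    theta = 0.25 * np.pi * (x + 1.0)        # map nodes from [-1, 1] to [0, pi/2]
    w = 0.25 * np.pi * w                    # rescale weights accordingly
    t = np.tan(theta)
    jac = 1.0 / np.cos(theta) ** 2          # dt = sec^2(theta) d(theta)

    result = np.zeros(n)
    for ti, wi, ji in zip(t, w, jac):
        # Shifted operator v -> K v + t^2 v, built from the user's matvec only.
        shifted = LinearOperator(
            (n, n), matvec=lambda v, s=ti ** 2: matvec(v) + s * v
        )
        sol, _ = cg(shifted, b)             # each solve uses only MVMs with K
        result += wi * ji * sol
    return (2.0 / np.pi) * result

# Usage: check against a dense reference on a small, well-conditioned SPD matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
K = A @ A.T + 200.0 * np.eye(200)
b = rng.standard_normal(200)

approx = inv_sqrt_mv(lambda v: K @ v, b)
exact = np.linalg.solve(np.real(sqrtm(K)), b)   # dense O(N^3) reference
print(np.max(np.abs(approx - exact)))           # shrinks as num_quad grows

# K^{1/2} b then follows from one extra MVM: K @ (K^{-1/2} b).
```

With this structure, sampling from $\mathcal{N}(\mathbf{0}, \mathbf{K})$ reduces to applying the square root to a standard normal vector, and whitening reduces to applying the inverse square root, all without ever factorizing $\mathbf{K}$.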
