Robust, randomized preconditioning for kernel ridge regression
Main text: 24 pages; bibliography: 6 pages; 14 figures; 2 tables
Abstract
This paper investigates preconditioned conjugate gradient techniques for solving kernel ridge regression (KRR) problems with a medium to large number of data points, and it describes two methods with the strongest guarantees available. The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in O(N^2) arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, offers an accurate solution to a restricted version of the KRR problem involving k << N selected data centers at a cost of O((N + k^2) k log k) operations. The proposed methods efficiently solve a broad range of KRR problems, making them well suited for practical applications.
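As an illustrative sketch (not the authors' reference implementation), the first approach can be mimicked in NumPy: build a rank-k approximation K ≈ F F^T by randomly pivoted partial Cholesky, then run conjugate gradient on (K + λI)α = y with the preconditioner P = F F^T + λI applied via the Woodbury identity. The Gaussian kernel, the rank k = 40, and the regularization λ below are arbitrary choices for the toy problem, not values from the paper.

```python
import numpy as np

def rpcholesky(K, k, rng):
    """Randomly pivoted partial Cholesky: returns F with K ~ F @ F.T (rank k)."""
    n = K.shape[0]
    F = np.zeros((n, k))
    d = np.diag(K).astype(float).copy()   # residual diagonal
    for i in range(k):
        s = rng.choice(n, p=d / d.sum())  # sample pivot proportional to residual diagonal
        g = K[:, s] - F[:, :i] @ F[s, :i]
        F[:, i] = g / np.sqrt(g[s])       # sketch: assumes the pivot residual stays positive
        d = np.maximum(d - F[:, i] ** 2, 0.0)
    return F

def pcg(K, y, lam, F, tol=1e-8, maxit=200):
    """Conjugate gradient on (K + lam*I) a = y, preconditioned by P = F F^T + lam*I."""
    n = K.shape[0]
    G = F.T @ F + lam * np.eye(F.shape[1])
    def apply_Pinv(r):
        # Woodbury: (F F^T + lam I)^{-1} r = (r - F (lam I + F^T F)^{-1} F^T r) / lam
        return (r - F @ np.linalg.solve(G, F.T @ r)) / lam
    a = np.zeros(n)
    r = y - (K @ a + lam * a)
    z = apply_Pinv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = K @ p + lam * p
        alpha = rz / (p @ Ap)
        a += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(y):
            break
        z = apply_Pinv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return a

# Toy problem: Gaussian kernel on random 2-D points
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2)
y = rng.standard_normal(300)
lam = 1e-3
F = rpcholesky(K, 40, rng)
a = pcg(K, y, lam, F)
print(np.linalg.norm((K + lam * np.eye(300)) @ a - y))
```

The preconditioner solve costs O(Nk) per iteration after an O(Nk^2) setup, which is what makes a low-rank preconditioner attractive compared with factoring the full N x N regularized kernel matrix.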
