Faster Principal Component Regression via Optimal Polynomial Approximation to sgn(x)

International Conference on Machine Learning (ICML), 2016
Abstract

We solve principal component regression (PCR) by providing an efficient algorithm to project any vector onto the subspace spanned by the top principal components of a matrix. Our algorithm does not require any explicit construction of the top principal components, and is therefore suitable for large-scale PCR instances. Specifically, to project onto the subspace spanned by principal components with eigenvalues above a threshold $\lambda$, with multiplicative accuracy $(1 \pm \gamma)\lambda$, our algorithm requires $\tilde{O}(\gamma^{-1})$ black-box calls to ridge regression. In contrast, previous results require $\tilde{O}(\gamma^{-2})$ such calls. We obtain this result by designing a degree-optimal polynomial approximation of the sign function.
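The core identity behind this approach is that projecting onto eigenvectors with eigenvalue above $\lambda$ amounts to applying the matrix function $\frac{1}{2}(I + \mathrm{sgn}(M - \lambda I))$ to the input vector, so any polynomial approximation of $\mathrm{sgn}(x)$ yields an approximate projector built only from matrix-vector products. The sketch below illustrates this identity with a simple Newton–Schulz-style polynomial iteration $p(x) = 1.5x - 0.5x^3$ (an assumption chosen for illustration; it is not the degree-optimal polynomial of the paper, and it uses dense matrix products rather than the paper's ridge-regression black boxes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric matrix with known eigenvalues in (0, 1)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
eigs = np.array([0.05, 0.1, 0.2, 0.6, 0.8, 0.95])
M = Q @ np.diag(eigs) @ Q.T

lam = 0.4                       # eigenvalue threshold
v = rng.standard_normal(6)      # vector to project

# Reference: exact projection onto eigenvectors with eigenvalue > lam
w, U = np.linalg.eigh(M)
proj_exact = U @ np.diag((w > lam).astype(float)) @ U.T @ v

# Polynomial route: indicator(x > lam) = (1 + sgn(x - lam)) / 2,
# with sgn approximated by iterating p(x) = 1.5 x - 0.5 x^3
# (converges to sgn(x) for 0 < |x| < sqrt(3)).
X = M - lam * np.eye(6)         # shift so the threshold sits at zero
for _ in range(25):
    X = 1.5 * X - 0.5 * (X @ X @ X)
proj_poly = 0.5 * (v + X @ v)

print(np.max(np.abs(proj_poly - proj_exact)))  # small approximation error
```

Because the iteration only ever multiplies by the (shifted) matrix, the same scheme works when matrix-vector products are available only implicitly, which is what makes the black-box ridge-regression formulation possible.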
