Faster Principal Component Regression via Optimal Polynomial
Approximation to sgn(x)
International Conference on Machine Learning (ICML), 2016
Abstract
We solve principal component regression (PCR) by providing an efficient algorithm that projects any vector onto the subspace spanned by the top principal components of a matrix. Our algorithm does not require any explicit construction of the top principal components, and is therefore suitable for large-scale PCR instances. Specifically, to project onto the subspace spanned by the principal components whose eigenvalues exceed a given threshold, to within multiplicative accuracy, our algorithm requires only a small number of black-box calls to ridge regression; in contrast, previous results require more such calls. We obtain this result by designing a degree-optimal polynomial approximation of the sign function.
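The core idea is that the projector onto the eigenspace above a threshold λ can be written as (I + sgn(M − λI))/2, where sgn is applied to the eigenvalues, and that sgn can be replaced by a polynomial so only matrix-vector products (in the paper, ridge-regression calls) are needed. The following is a minimal numerical sketch of this reduction only: it uses the classical Newton–Schulz sign iteration rather than the paper's degree-optimal polynomial, and it forms small dense matrices explicitly instead of making black-box ridge-regression calls.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Random symmetric matrix with known eigenvalues in (-1, 1),
# bounded away from the threshold (an eigengap assumption).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(-0.9, 0.9, n)  # none equal to the threshold 0
M = Q @ np.diag(eigs) @ Q.T

lam = 0.0  # eigenvalue threshold
# Exact projector onto eigenvectors with eigenvalue > lam (for comparison).
proj_exact = Q @ np.diag((eigs > lam).astype(float)) @ Q.T

# Polynomial approximation of sgn(x) via the Newton-Schulz iteration
# s_{k+1} = s_k (3 - s_k^2) / 2, which converges to sgn(x) for 0 < |x| < sqrt(3).
# (The paper instead constructs a degree-optimal polynomial for this step.)
S = M - lam * np.eye(n)  # shift so the threshold maps to 0
for _ in range(30):
    S = S @ (3 * np.eye(n) - S @ S) / 2

# (I + sgn(M - lam*I)) / 2 is the projector onto the top eigenspace.
proj_approx = (np.eye(n) + S) / 2

# Apply both projectors to a vector and compare.
b = rng.standard_normal(n)
err = np.linalg.norm(proj_approx @ b - proj_exact @ b)
```

The quality of the approximation is governed by how far the eigenvalues stay from the threshold: the closer an eigenvalue is to λ, the higher the polynomial degree (here, the iteration count) needed, which is exactly the trade-off the paper's degree-optimal construction addresses.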
