
On Agnostic PAC Learning using $\mathcal{L}_2$-polynomial Regression and Fourier-based Algorithms

Abstract

We develop a framework using Hilbert spaces as a proxy to analyze PAC learning problems with structural properties. We consider a joint Hilbert space incorporating the relation between the true label and the predictor under a joint distribution $D$. We demonstrate that agnostic PAC learning with 0-1 loss is equivalent to an optimization problem in the Hilbert space domain. With our model, we revisit the PAC learning problem using least-squares-based methods such as $\mathcal{L}_2$-polynomial regression and Linial's low-degree algorithm. We study learning with respect to several hypothesis classes, such as half-spaces and polynomially approximated classes (i.e., functions approximable by a fixed-degree polynomial). We prove that, under some distributional assumptions, such methods achieve generalization error at most $2\mathrm{opt}$, where $\mathrm{opt}$ is the optimal error of the class. Hence, this yields the tightest known bound on the generalization error whenever $\mathrm{opt}\leq 0.2$.
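The least-squares approach the abstract refers to can be illustrated with a minimal one-dimensional sketch (my own simplified illustration, not the paper's algorithm): fit a fixed-degree polynomial to $\pm 1$ labels by $\mathcal{L}_2$ regression, then classify by the sign of the fitted polynomial. The target concept, noise rate, and degree below are assumptions chosen for the example.

```python
import numpy as np

def l2_poly_regression_classifier(x, y, degree):
    """Fit a degree-`degree` polynomial to labels y in {-1, +1} by
    least squares (L2), and return the classifier h(x) = sign(p(x)).

    Illustrative 1-D sketch; the paper's setting is more general
    (e.g., half-spaces over R^n under distributional assumptions)."""
    # Vandermonde design matrix: columns 1, x, x^2, ..., x^degree.
    X = np.vander(x, degree + 1, increasing=True)
    # Least-squares fit of the polynomial coefficients.
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    def h(x_new):
        X_new = np.vander(np.atleast_1d(x_new), degree + 1, increasing=True)
        return np.sign(X_new @ coef)

    return h

# Usage: learn the threshold concept sign(x) from noisy samples
# (the 10% label noise plays the role of the agnostic setting).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = np.sign(x)
flip = rng.random(500) < 0.1
y[flip] *= -1
h = l2_poly_regression_classifier(x, y, degree=3)
err = np.mean(h(x) != y)  # empirical 0-1 error of the sign classifier
```

In the spirit of the paper's $2\mathrm{opt}$ guarantee, the empirical 0-1 error here should stay close to the 10% noise floor rather than degrade arbitrarily.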
