Tight Bounds on $\ell_1$ Approximation and Learning of Self-Bounding Functions

Abstract

We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube $\{0,1\}^n$. Informally, a function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$, $f(x)$ upper bounds the sum of all the $n$ marginal decreases in the value of the function at $x$. Self-bounding functions include such well-known classes of functions as submodular and fractionally-subadditive (XOS) functions. They were introduced by Boucheron et al. (2000) in the context of concentration of measure inequalities. Our main result is a nearly tight $\ell_1$-approximation of self-bounding functions by low-degree juntas. Specifically, every self-bounding function can be $\epsilon$-approximated in $\ell_1$ by a polynomial of degree $\tilde{O}(1/\epsilon)$ over $2^{\tilde{O}(1/\epsilon)}$ variables. We show that both the degree and the junta size are optimal up to logarithmic factors. Previous techniques considered the stronger $\ell_2$ approximation and proved nearly tight bounds of $\Theta(1/\epsilon^{2})$ on the degree and $2^{\Theta(1/\epsilon^2)}$ on the number of variables. Our bounds rely on an analysis of the noise stability of self-bounding functions together with a stronger connection between noise stability and $\ell_1$ approximation by low-degree polynomials. This technique can also be used to obtain tighter bounds on $\ell_1$ approximation by low-degree polynomials and a faster learning algorithm for halfspaces. These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of self-bounding functions relative to the uniform distribution. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity $n^{\tilde{\Theta}(1/\epsilon)}$.
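For concreteness, the informal definition and the approximation guarantee above can be written as follows (a sketch of one standard formalization; the polynomial $p$ is introduced here only for illustration). A function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$,

$$\sum_{i=1}^{n} \Bigl( f(x) - \min_{b \in \{0,1\}} f(x_1,\dots,x_{i-1},b,x_{i+1},\dots,x_n) \Bigr) \;\le\; f(x),$$

and the main result then asserts that for every such $f$ and every $\epsilon > 0$ there is a polynomial $p$ of degree $\tilde{O}(1/\epsilon)$, depending on at most $2^{\tilde{O}(1/\epsilon)}$ of the variables, with

$$\mathop{\mathbb{E}}_{x \sim \{0,1\}^n}\bigl[\,|f(x) - p(x)|\,\bigr] \;\le\; \epsilon,$$

where the expectation is over the uniform distribution on $\{0,1\}^n$.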
