
Nearly Tight Bounds on $\ell_1$ Approximation of Self-Bounding Functions

18 April 2014
Vitaly Feldman
Pravesh Kothari
J. Vondrák
arXiv: 1404.4702
Abstract

We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube $\{0,1\}^n$. Informally, a function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$, $f(x)$ upper bounds the sum of all $n$ marginal decreases in the value of the function at $x$. Self-bounding functions include such well-known classes as submodular and fractionally subadditive (XOS) functions. They were introduced by Boucheron et al. in the context of concentration-of-measure inequalities. Our main result is a nearly tight $\ell_1$-approximation of self-bounding functions by low-degree juntas. Specifically, every self-bounding function can be $\epsilon$-approximated in $\ell_1$ by a polynomial of degree $\tilde{O}(1/\epsilon)$ over $2^{\tilde{O}(1/\epsilon)}$ variables. Both the degree and the junta size are optimal up to logarithmic factors. Previously, the best known bounds were $O(1/\epsilon^2)$ on the degree and $2^{O(1/\epsilon^2)}$ on the number of variables (Feldman and Vondrák, 2013). These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of submodular, XOS, and self-bounding functions. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity $n^{\tilde{\Theta}(1/\epsilon)}$.
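For reference, the informal condition above admits a standard formalization consistent with the definition of Boucheron et al.; the notation $x^{i \leftarrow b}$, for $x$ with its $i$-th coordinate set to $b$, is introduced here for illustration. A function $f:\{0,1\}^n \rightarrow \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$,

$$\sum_{i=1}^{n} \Bigl( f(x) - \min_{b \in \{0,1\}} f(x^{i \leftarrow b}) \Bigr) \le f(x),$$

with each marginal decrease $f(x) - \min_b f(x^{i \leftarrow b})$ additionally required to lie in $[0,1]$. For small $n$ this condition can be verified exhaustively; the Python sketch below is illustrative and not from the paper (the function name and the floating-point tolerance are our own choices).

from itertools import product

def is_self_bounding(f, n):
    # Exhaustively verify the self-bounding sum condition over {0,1}^n:
    #   sum_i (f(x) - min_b f(x with bit i set to b)) <= f(x) for all x.
    # Exponential in n; intended only for tiny sanity checks.
    for x in product((0, 1), repeat=n):
        fx = f(x)
        total = sum(
            fx - min(f(x[:i] + (b,) + x[i + 1:]) for b in (0, 1))
            for i in range(n)
        )
        if total > fx + 1e-9:  # tolerance for floating-point noise
            return False
    return True

# Example: f(x) = number of ones is modular (hence submodular and self-bounding);
# its marginal decreases sum to exactly f(x).
print(is_self_bounding(lambda x: float(sum(x)), 4))  # True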
