
New Statistical and Computational Results for Learning Junta Distributions

International Workshop on Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM), 2025
Main: 21 pages
2 figures
1 table
Bibliography: 3 pages
Abstract

We study the problem of learning junta distributions on $\{0,1\}^n$, where a distribution is a $k$-junta if its probability mass function depends on a subset of at most $k$ variables. We make two main contributions:

- We show that learning $k$-junta distributions is \emph{computationally} equivalent to learning $k$-parity functions with noise (LPN), a landmark problem in computational learning theory.
- We design an algorithm for learning junta distributions whose statistical complexity is optimal, up to polylogarithmic factors. Computationally, our algorithm matches the complexity of previous (non-sample-optimal) algorithms.

Combined, our two contributions imply that our algorithm cannot be significantly improved, statistically or computationally, barring a breakthrough for LPN.
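To make the definition concrete, here is a minimal sketch (not from the paper; the helper name and the toy probability table are illustrative) of a $k$-junta distribution: a pmf on $\{0,1\}^n$ that depends only on the coordinates in a set $S$ of size $k$, with the remaining $n-k$ bits uniform.

```python
import itertools

def make_junta_pmf(n, junta, table):
    """Build a pmf on {0,1}^n that depends only on the coordinates in
    `junta` (a k-junta distribution). `table` assigns a probability to
    each setting of the k junta bits; the other n-k bits are uniform,
    so each full string x gets table[x restricted to junta] / 2^(n-k)."""
    k = len(junta)
    def pmf(x):
        key = tuple(x[i] for i in junta)
        return table[key] / 2 ** (n - k)
    return pmf

# Toy example: a 2-junta on {0,1}^5 whose mass depends only on bits 1 and 3.
n, junta = 5, (1, 3)
table = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
pmf = make_junta_pmf(n, junta, table)

# Sanity check: the probabilities over all 2^5 strings sum to 1, and any
# two strings agreeing on the junta coordinates get equal mass.
total = sum(pmf(x) for x in itertools.product((0, 1), repeat=n))
```

The learning problem the paper studies is the reverse direction: given samples from such a distribution with the set `junta` unknown, recover the relevant coordinates and the pmf.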
