
Borsuk-Ulam and Replicable Learning of Large-Margin Halfspaces

Main: 17 pages
Bibliography: 4 pages
Appendix: 5 pages
Abstract

We prove that the list replicability number of $d$-dimensional $\gamma$-margin half-spaces satisfies \[ \frac{d}{2}+1 \le \mathrm{LR}(H^d_\gamma) \le d, \] which grows with dimension. This resolves several open problems:

- Every disambiguation of infinite-dimensional large-margin half-spaces to a total concept class has unbounded Littlestone dimension, answering an open question of Alon, Hanneke, Holzman, and Moran (FOCS '21).
- Every disambiguation of the Gap Hamming Distance problem in the large gap regime has unbounded public-coin randomized communication complexity. This answers an open question of Fang, Göös, Harms, and Hatami (STOC '25).
- There is a separation of $O(1)$ vs. $\omega(1)$ between randomized and pseudo-deterministic communication complexity.
- The maximum list-replicability number of any finite set of points and homogeneous half-spaces in $d$-dimensional Euclidean space is $d$, resolving a problem of Chase, Moran, and Yehudayoff (FOCS '23).
- There exists a partial concept class with Littlestone dimension $1$ such that all its disambiguations have infinite Littlestone dimension. This resolves a problem of Cheung, H. Hatami, P. Hatami, and Hosseini (ICALP '23).

Our lower bound follows from a topological argument based on a local Borsuk-Ulam theorem. For the upper bound, we construct a list-replicable learning rule using the generalization properties of SVMs.
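To give a concrete feel for the large-margin setting the abstract refers to, the toy sketch below learns a $\gamma$-margin halfspace from samples. Note this uses the classical perceptron as a simple stand-in learner, not the paper's actual SVM-based list-replicable rule; the data distribution, margin value, and learner here are all illustrative assumptions.

```python
import numpy as np

# Toy illustration (NOT the paper's algorithm): learn a gamma-margin
# halfspace over the unit sphere with the classical perceptron.
rng = np.random.default_rng(0)
d, gamma = 5, 0.2           # assumed dimension and margin for the demo
w_star = np.zeros(d)
w_star[0] = 1.0             # hypothetical true unit normal

# Sample unit vectors, keeping only points with margin >= gamma,
# so the data is gamma-margin separable by construction.
points = []
while len(points) < 100:
    x = rng.normal(size=d)
    x /= np.linalg.norm(x)
    if abs(x @ w_star) >= gamma:
        points.append(x)
X = np.array(points)
y = np.sign(X @ w_star)

# Perceptron: on gamma-margin data with unit-norm points, it makes at
# most 1/gamma^2 mistakes in total before finding a consistent halfspace.
w = np.zeros(d)
for _ in range(int(1 / gamma**2) + 1):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w) <= 0:
            w += yi * xi
            mistakes += 1
    if mistakes == 0:       # consistent with all samples
        break

assert np.all(np.sign(X @ w) == y)
```

The margin condition is what makes the mistake bound dimension-free here; the paper's point is that *replicability* of such learners, in contrast, necessarily scales with the dimension $d$.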
