ResearchTrend.AI

arXiv:2206.13257
Finite Littlestone Dimension Implies Finite Information Complexity

27 June 2022
Aditya Pradeep
Ido Nachum
Michael C. Gastpar
Abstract

We prove that every online learnable class of functions of Littlestone dimension d admits a learning algorithm with finite information complexity. Towards this end, we use the notion of a globally stable algorithm. Generally, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in d. We also show there is room for improvement: for a canonical online learnable class, indicator functions of affine subspaces of dimension d, the information complexity can be upper bounded logarithmically in d.
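For readers unfamiliar with the quantity the abstract centers on: the Littlestone dimension of a class H is the depth of the deepest mistake tree H shatters, and for a tiny finite class it can be computed directly from its standard recursion, Ldim(H) = max over points x that split H of 1 + min(Ldim(H restricted to label 0 at x), Ldim(H restricted to label 1 at x)). The sketch below is purely illustrative and is not code from the paper; the example class (thresholds on a three-point domain) is our own choice.

```python
def ldim(H):
    """Littlestone dimension of a finite class H, represented as a frozenset
    of tuples h = (h(x_1), ..., h(x_n)) of {0,1}-labels over a fixed finite
    domain of n points. Brute-force mistake-tree recursion; only practical
    for very small classes."""
    if len(H) <= 1:
        return 0  # at most one hypothesis left: no mistake can be forced
    n = len(next(iter(H)))
    best = 0
    for x in range(n):
        H0 = frozenset(h for h in H if h[x] == 0)
        H1 = frozenset(h for h in H if h[x] == 1)
        if H0 and H1:  # x splits H, so an adversary can force a mistake here
            best = max(best, 1 + min(ldim(H0), ldim(H1)))
    return best

# Thresholds h_t(x) = 1[x >= t] on the domain {1, 2, 3}: four hypotheses.
thresholds = frozenset({(1, 1, 1), (0, 1, 1), (0, 0, 1), (0, 0, 0)})
print(ldim(thresholds))  # 2
```

The threshold class illustrates the gap the abstract alludes to: its Littlestone dimension grows only logarithmically with the number of hypotheses, since each split can halve the class.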
