Uniform regret bounds over $\mathbb{R}^d$ for the sequential linear regression problem with the square loss

29 May 2018
Pierre Gaillard
Sébastien Gerchinovitz
Malo Huard
Gilles Stoltz
arXiv:1805.11386
Abstract

We consider the setting of online linear regression for arbitrary deterministic sequences, with the square loss. We are interested in the aim set by Bartlett et al. (2015): obtain regret bounds that hold uniformly over all competitor vectors. When the feature sequence is known at the beginning of the game, they provided closed-form regret bounds of $2dB^2 \ln T + \mathcal{O}_T(1)$, where $T$ is the number of rounds and $B$ is a bound on the observations. Instead, we derive bounds with an optimal constant of $1$ in front of the $dB^2 \ln T$ term. In the case of sequentially revealed features, we also derive an asymptotic regret bound of $dB^2 \ln T$ for any individual sequence of features and bounded observations. All our algorithms are variants of the online non-linear ridge regression forecaster, either with a data-dependent regularization or with almost no regularization.
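
For a concrete picture of the forecaster family the abstract refers to, here is a minimal sketch of the online non-linear ridge regression forecaster with a fixed regularization (the Vovk-Azoury-Warmuth variant, which folds the current feature into the Gram matrix before predicting). The function name, the fixed parameter `lam`, and the toy data are illustrative assumptions; the paper's actual variants use data-dependent or nearly vanishing regularization, which is not reproduced here.

```python
import numpy as np

def online_ridge_forecaster(X, y, lam=1.0):
    """Sketch of an online (non-linear) ridge regression forecaster.

    At round t the forecaster predicts theta_t^T x_t, where theta_t solves
    the ridge problem over past observations, with the current feature x_t
    already included in the Gram matrix (Vovk-Azoury-Warmuth style).
    `lam` is a fixed regularization parameter, assumed for illustration.
    """
    T, d = X.shape
    A = lam * np.eye(d)        # regularized Gram matrix
    b = np.zeros(d)            # running sum of y_s * x_s over past rounds
    preds = np.empty(T)
    for t in range(T):
        x = X[t]
        A += np.outer(x, x)    # include the current feature before predicting
        theta = np.linalg.solve(A, b)
        preds[t] = theta @ x
        b += y[t] * x          # y_t is revealed only after the prediction
    return preds

# Toy usage: cumulative square-loss regret against the best fixed comparator.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = np.clip(X @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=1000), -3.0, 3.0)
preds = online_ridge_forecaster(X, y, lam=1.0)
theta_star = np.linalg.lstsq(X, y, rcond=None)[0]
regret = np.sum((y - preds) ** 2) - np.sum((y - X @ theta_star) ** 2)
```

Taking `lam` close to zero corresponds to the "almost no regularization" regime mentioned in the abstract; the clipping of the observations plays the role of the bound $B$ in the regret bounds above.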
