  3. 2202.13603
54
2

Optimal Online Generalized Linear Regression with Stochastic Noise and Its Application to Heteroscedastic Bandits

28 February 2022
Heyang Zhao, Dongruo Zhou, Jiafan He, Quanquan Gu
Abstract

We study the problem of online generalized linear regression in the stochastic setting, where the label is generated from a generalized linear model with possibly unbounded additive noise. We provide a sharp analysis of the classical follow-the-regularized-leader (FTRL) algorithm to cope with the label noise. More specifically, for $\sigma$-sub-Gaussian label noise, our analysis provides a regret upper bound of $O(\sigma^2 d \log T) + o(\log T)$, where $d$ is the dimension of the input vector and $T$ is the total number of rounds. We also prove an $\Omega(\sigma^2 d \log(T/d))$ lower bound for stochastic online linear regression, which indicates that our upper bound is nearly optimal. In addition, we extend our analysis to a more refined Bernstein noise condition. As an application, we study generalized linear bandits with heteroscedastic noise and propose an algorithm based on FTRL to achieve the first variance-aware regret bound.
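To make the FTRL setup concrete, the sketch below specializes it to the linear (identity-link) case with squared loss and ridge regularization, where the per-round minimizer has a closed form; this is an illustration of the general scheme, not the paper's algorithm or analysis, and the function name and synthetic data are ours.

```python
import numpy as np

def ftrl_online_regression(X, y, lam=1.0):
    """FTRL for online linear regression with squared loss.

    At round t the learner plays
        theta_t = argmin_theta  sum_{s<t} (x_s^T theta - y_s)^2 + lam * ||theta||^2,
    whose closed form is (lam*I + sum_s x_s x_s^T)^{-1} sum_s y_s x_s.
    Returns the per-round predictions and the final estimate.
    """
    d = X.shape[1]
    A = lam * np.eye(d)      # regularized Gram matrix lam*I + sum x_s x_s^T
    b = np.zeros(d)          # accumulated sum of y_s * x_s
    theta = np.zeros(d)
    preds = []
    for x_t, y_t in zip(X, y):
        preds.append(theta @ x_t)       # predict before seeing y_t
        A += np.outer(x_t, x_t)         # then observe (x_t, y_t) and update
        b += y_t * x_t
        theta = np.linalg.solve(A, b)
    return np.array(preds), theta

# Synthetic stochastic data: labels are linear in x plus sub-Gaussian noise.
rng = np.random.default_rng(0)
theta_star = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(2000, 3))
y = X @ theta_star + 0.1 * rng.normal(size=2000)   # sigma = 0.1 Gaussian noise
_, theta_hat = ftrl_online_regression(X, y)
print(theta_hat)   # close to theta_star
```

With sub-Gaussian noise the estimate concentrates around the true parameter; the cumulative squared prediction error of this scheme is what the abstract's $O(\sigma^2 d \log T)$ regret bound controls.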
