ResearchTrend.AI

arXiv:1609.03261
Less than a Single Pass: Stochastically Controlled Stochastic Gradient Method

12 September 2016
Lihua Lei
Michael I. Jordan
Abstract

We develop and analyze a procedure for gradient-based optimization that we refer to as stochastically controlled stochastic gradient (SCSG). As a member of the SVRG family of algorithms, SCSG makes use of gradient estimates at two scales, with the number of updates at the faster scale being governed by a geometric random variable. Unlike most existing algorithms in this family, both the computation cost and the communication cost of SCSG do not necessarily scale linearly with the sample size n; indeed, these costs are independent of n when the target accuracy is low. An experimental evaluation on real datasets confirms the effectiveness of SCSG.
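The two-scale structure described in the abstract can be sketched as follows: each outer step computes an anchor gradient on a subsampled batch (rather than the full dataset, which is what decouples the cost from n), and the inner loop performs a geometrically distributed number of variance-reduced updates. This is a minimal illustrative sketch, not the paper's reference implementation; the function names, parameter choices, and the convention of drawing inner samples from the current batch are assumptions for the example.

```python
import numpy as np

def scsg(grad_fn, x0, n, batch_size=32, step=0.05, n_outer=300, seed=0):
    """Sketch of a stochastically controlled stochastic gradient (SCSG) loop.

    grad_fn(x, idx) must return the gradient averaged over the component
    indices in `idx`. `n` is the number of data points (only used to sample
    batches; the per-iteration cost does not depend on it).
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(n_outer):
        # Slow scale: anchor gradient on a random batch, not the full data.
        batch = rng.choice(n, size=batch_size, replace=False)
        mu = grad_fn(x, batch)
        x_ref = x.copy()
        # Fast scale: the number of inner updates is a geometric random
        # variable with mean equal to the batch size.
        n_inner = rng.geometric(1.0 / batch_size)
        for _ in range(n_inner):
            i = np.array([rng.choice(batch)])
            # SVRG-style variance-reduced update around the anchor point.
            x = x - step * (grad_fn(x, i) - grad_fn(x_ref, i) + mu)
    return x
```

For example, on a least-squares problem `grad_fn` would be `lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)`; the loop then converges toward the minimizer while touching only `batch_size`-sized subsets per outer step.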

View on arXiv