
arXiv: 1703.05840

Conditional Accelerated Lazy Stochastic Gradient Descent

16 March 2017
Guanghui Lan, Sebastian Pokutta, Yi Zhou, Daniel Zink
Abstract

In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and convergence rate $O\left(\frac{1}{\varepsilon^2}\right)$, improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of Hazan and Kale [2012], which converges at rate $O\left(\frac{1}{\varepsilon^4}\right)$.
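To make the projection-free setting concrete, here is a minimal sketch of a basic stochastic Frank-Wolfe loop in the Hazan-Kale style that the abstract compares against (not the paper's accelerated lazy algorithm). The least-squares objective, the minibatch oracle, and the choice of the probability simplex as the feasible set are illustrative assumptions; the key point is that each step calls a linear minimization oracle instead of a projection.

```python
# Illustrative sketch, NOT the paper's algorithm: projection-free
# stochastic Frank-Wolfe on the probability simplex. The data and
# objective below are hypothetical, chosen only to show the mechanics.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5))                      # hypothetical data
x_true = np.array([0.1, 0.2, 0.3, 0.2, 0.2])
b = A @ x_true + 0.01 * rng.normal(size=50)

def stochastic_gradient(x, batch=8):
    """Stochastic first-order oracle: minibatch gradient of ||Ax - b||^2."""
    idx = rng.integers(0, A.shape[0], size=batch)
    Ai, bi = A[idx], b[idx]
    return 2.0 * Ai.T @ (Ai @ x - bi) / batch

def lmo_simplex(g):
    """Linear minimization oracle over the simplex: returns the vertex
    minimizing <g, v>, i.e. the coordinate with the smallest gradient."""
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

x = np.full(5, 0.2)                               # simplex barycenter
for t in range(1, 2001):
    g = stochastic_gradient(x)
    v = lmo_simplex(g)                            # replaces a projection
    gamma = 2.0 / (t + 2)                         # standard FW step size
    x = (1.0 - gamma) * x + gamma * v             # convex combo stays feasible

print(np.round(x, 3))
```

Because every iterate is a convex combination of simplex vertices, the loop never needs a (potentially expensive) projection; this is the structural property that both the baseline and the paper's lazy accelerated variant exploit.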
