Towards Noise-adaptive, Problem-adaptive (Accelerated) Stochastic Gradient Descent
arXiv:2110.11442 · 21 October 2021
Sharan Vaswani, Benjamin Dubois-Taine, Reza Babanezhad

Papers citing "Towards Noise-adaptive, Problem-adaptive (Accelerated) Stochastic Gradient Descent" (5 of 5 shown)

An Accelerated Algorithm for Stochastic Bilevel Optimization under Unbounded Smoothness
Xiaochuan Gong, Jie Hao, Mingrui Liu
28 Sep 2024

Faster Convergence of Stochastic Accelerated Gradient Descent under Interpolation
Aaron Mishkin, Mert Pilanci, Mark Schmidt
03 Apr 2024

Two Sides of One Coin: the Limits of Untuned SGD and the Power of Adaptive Methods
Junchi Yang, Xiang Li, Ilyas Fatkhullin, Niao He
21 May 2023

Analyzing Monotonic Linear Interpolation in Neural Network Loss Landscapes
James Lucas, Juhan Bae, Michael Ruogu Zhang, Stanislav Fort, R. Zemel, Roger C. Grosse
Community: MoMe
22 Apr 2021

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt
16 Aug 2016