A Second look at Exponential and Cosine Step Sizes: Simplicity, Adaptivity, and Performance

12 February 2020
Xiaoyun Li, Zhenxun Zhuang, Francesco Orabona
arXiv: 2002.05273
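
The paper's title names two step-size schedules for SGD. As a rough illustration only (not necessarily the exact parameterization studied in the paper), the commonly used forms are an exponential decay eta_t = eta0 * alpha^t and a cosine schedule eta_t = eta0 * (1 + cos(pi * t / T)) / 2. The sketch below renders these standard forms in Python; eta0, alpha, and T are illustrative hyperparameters, not values taken from the paper.

```python
import math

def exponential_step_size(t, eta0=0.1, alpha=0.999):
    """Exponentially decaying step size: eta_t = eta0 * alpha**t."""
    return eta0 * alpha ** t

def cosine_step_size(t, eta0=0.1, T=1000):
    """Cosine-annealed step size: eta_t = eta0 * (1 + cos(pi * t / T)) / 2."""
    return eta0 * (1 + math.cos(math.pi * t / T)) / 2

# Example: evaluate both schedules at a few iterations over a horizon of T = 1000.
for t in (0, 250, 500, 1000):
    print(t, exponential_step_size(t), cosine_step_size(t))
```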

Papers citing "A Second look at Exponential and Cosine Step Sizes: Simplicity, Adaptivity, and Performance"

5 of 5 citing papers shown:

  1. Understanding AdamW through Proximal Methods and Scale-Freeness
     Zhenxun Zhuang, Mingrui Liu, Ashok Cutkosky, Francesco Orabona
     31 Jan 2022
  2. On Uniform Boundedness Properties of SGD and its Momentum Variants
     Xiaoyu Wang, M. Johansson
     25 Jan 2022
  3. A High Probability Analysis of Adaptive SGD with Momentum
     Xiaoyun Li, Francesco Orabona
     28 Jul 2020
  4. Bag of Tricks for Image Classification with Convolutional Neural Networks
     Tong He, Zhi-Li Zhang, Hang Zhang, Zhongyue Zhang, Junyuan Xie, Mu Li
     04 Dec 2018
  5. Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
     Hamed Karimi, J. Nutini, Mark W. Schmidt
     16 Aug 2016