
Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent

IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
20 August 2019
Tomer Lancewicki, Selçuk Köprü
ArXiv (abs) · PDF · HTML

Papers citing "Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent" (2 of 2 papers shown)
Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap
Lingxi Xie, Xin Chen, Kaifeng Bi, Longhui Wei, Yuhui Xu, ..., Lanfei Wang, Anxiang Xiao, Jianlong Chang, Xiaopeng Zhang, Qi Tian
ViT · 04 Aug 2020
Parabolic Approximation Line Search for DNNs
Max Mutschler, A. Zell
ODL · 28 Mar 2019