A decreasing scaling transition scheme from Adam to SGD

Advanced Theory and Simulations (ATS), 2021
12 June 2021
Kun Zeng, Jinlan Liu, Zhixia Jiang, Dongpo Xu
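The title describes a scheme that transitions from Adam to SGD using a decreasing scaling factor. As a rough illustration only (this is not the paper's actual algorithm; the blend schedule rho = 1/(1 + decay*t) and all hyperparameters here are assumptions), such a transition might be sketched as a convex blend of an Adam-style adaptive step and a plain gradient step:

```python
import math

def adam_to_sgd_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                     eps=1e-8, decay=1e-3):
    """One hypothetical update blending Adam and SGD directions.

    rho = 1/(1 + decay*t) is an assumed decreasing scaling factor:
    early iterations follow Adam's adaptive direction; as rho -> 0,
    the update approaches plain SGD on the raw gradient g.
    """
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # Adam bias correction
    v_hat = v / (1 - beta2 ** t)
    adam_dir = m_hat / (math.sqrt(v_hat) + eps)
    rho = 1.0 / (1.0 + decay * t)            # decreasing blend weight
    step = rho * adam_dir + (1.0 - rho) * g  # Adam -> SGD transition
    return w - lr * step, m, v

# Toy usage: minimize f(w) = w^2, whose gradient is 2w.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    g = 2.0 * w
    w, m, v = adam_to_sgd_step(w, g, m, v, t, lr=0.01)
```

The idea illustrated is that Adam's per-coordinate scaling helps early progress, while fading it out recovers SGD's behavior late in training; the specific schedule above is a placeholder, not the scheme proposed in the paper.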

Papers citing "A decreasing scaling transition scheme from Adam to SGD"

Enhancing Deep Neural Network Training Efficiency and Performance through Linear Prediction
Scientific Reports (Sci Rep), 2023
Hejie Ying, Mengmeng Song, Yaohong Tang, S. Xiao, Zimin Xiao
17 Oct 2023