ResearchTrend.AI

arXiv: 2106.06749 (Cited By)
A decreasing scaling transition scheme from Adam to SGD
v2 (latest)

12 June 2021
Kun Zeng
Jinlan Liu
Zhixia Jiang
Dongpo Xu
    ODL
Links: arXiv (abs) · PDF · HTML · GitHub
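The title describes an optimizer that starts from Adam and gradually hands over to SGD via a decreasing scaling factor. As a rough illustration of that idea only (the mixing rule, decay schedule, and hyperparameter names below are assumptions, not the paper's actual scheme), one can blend an Adam-style direction with the raw gradient using a weight that decays over time:

```python
import numpy as np

def adam_to_sgd_step(grad, state, t, lr=1e-3, betas=(0.9, 0.999),
                     eps=1e-8, decay=0.01):
    """One update that blends an Adam-style direction with plain SGD.

    rho = exp(-decay * (t - 1)) scales the Adam component down over time,
    so early steps behave like Adam and late steps like SGD. This is an
    illustrative blending rule, not the scheme from the paper itself.
    """
    b1, b2 = betas
    state["m"] = b1 * state["m"] + (1 - b1) * grad        # first moment
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2   # second moment
    m_hat = state["m"] / (1 - b1 ** t)                    # bias correction
    v_hat = state["v"] / (1 - b2 ** t)
    adam_dir = m_hat / (np.sqrt(v_hat) + eps)             # Adam direction
    rho = np.exp(-decay * (t - 1))                        # decreasing weight
    return -lr * (rho * adam_dir + (1 - rho) * grad)

# usage: minimize f(w) = w^2 starting from w = 5
w = np.array(5.0)
state = {"m": np.zeros_like(w), "v": np.zeros_like(w)}
for t in range(1, 501):
    g = 2 * w                                # gradient of w^2
    w = w + adam_to_sgd_step(g, state, t, lr=0.1)
```

With this toy schedule the iterate contracts toward the minimum once the SGD term dominates; the exponential decay rate and the linear mixing are simply one plausible instantiation of a "decreasing scaling transition".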

Papers citing "A decreasing scaling transition scheme from Adam to SGD"

1 / 1 papers shown
Title: Enhancing Deep Neural Network Training Efficiency and Performance through Linear Prediction
Authors: Hejie Ying, Mengmeng Song, Yaohong Tang, S. Xiao, Zimin Xiao
Date: 17 Oct 2023