ResearchTrend.AI

Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning
arXiv:2003.00231 (v2, latest) · 29 February 2020
Yu Kobayashi, Hideaki Iiduka

Papers citing "Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning"

2 citing papers:

1. Conjugate-Gradient-like Based Adaptive Moment Estimation Optimization Algorithm for Deep Learning
   Jiawu Tian, Liwei Xu, Xiaowei Zhang, Yongqi Li
   ODL · 02 Apr 2024

2. Adaptive Learning Rate and Momentum for Training Deep Neural Networks
   Zhiyong Hao, Yixuan Jiang, Huihua Yu, H. Chiang
   ODL · 22 Jun 2021