ResearchTrend.AI
Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning

Yu Kobayashi, Hideaki Iiduka
29 February 2020 · arXiv:2003.00231

Papers citing "Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning"

2 papers shown

Conjugate-Gradient-like Based Adaptive Moment Estimation Optimization Algorithm for Deep Learning
Jiawu Tian, Liwei Xu, Xiaowei Zhang, Yongqi Li
ODL · 02 Apr 2024

Adaptive Learning Rate and Momentum for Training Deep Neural Networks
Zhiyong Hao, Yixuan Jiang, Huihua Yu, H. Chiang
ODL · 22 Jun 2021