Stochastic Conjugate Gradient Algorithm with Variance Reduction
IEEE Transactions on Neural Networks and Learning Systems (IEEE TNNLS), 2017
27 October 2017
arXiv: 1710.09979
Xiaobo Jin, Xu-Yao Zhang, Kaizhu Huang, Guanggang Geng

Papers citing "Stochastic Conjugate Gradient Algorithm with Variance Reduction"

6 papers
Distributed stochastic gradient tracking algorithm with variance reduction for non-convex optimization
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Xia Jiang, Xianlin Zeng, Jian Sun, Jie Chen
28 Jun 2021
Adaptive Learning Rate and Momentum for Training Deep Neural Networks
Zhiyong Hao, Yixuan Jiang, Huihua Yu, H. Chiang
22 Jun 2021
FastAdaBelief: Improving Convergence Rate for Belief-based Adaptive Optimizers by Exploiting Strong Convexity
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Yangfan Zhou, Kaizhu Huang, Cheng Cheng, Xuguang Wang, Amir Hussain, Xin Liu
28 Apr 2021
A variable metric mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize
Tengteng Yu, Xinwei Liu, Yuhong Dai, Jie Sun
02 Oct 2020
Accelerating Mini-batch SARAH by Step Size Rules
Information Sciences (Inf. Sci.), 2019
Zhuang Yang, Zengping Chen, Cheng-Yu Wang
20 Jun 2019
A Survey of Optimization Methods from a Machine Learning Perspective
IEEE Transactions on Cybernetics (IEEE Trans. Cybern.), 2019
Shiliang Sun, Zehui Cao, Han Zhu, Jing Zhao
17 Jun 2019