Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning

Abstract
This paper proposes a conjugate-gradient-based Adam algorithm that blends Adam with nonlinear conjugate gradient methods, and presents a convergence analysis for it. Numerical experiments on text classification and image classification show that the proposed algorithm can train deep neural network models in fewer epochs than existing adaptive stochastic optimization algorithms.
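To make the idea concrete, the sketch below illustrates one way a conjugate-gradient direction can be blended into an Adam-style update: the raw gradient is replaced by a direction that mixes in the previous search direction via a Fletcher-Reeves-style coefficient, and that direction is then fed into the usual Adam moment estimates. This is a minimal, hypothetical simplification for illustration only; the function name `cg_adam_step`, the choice of the Fletcher-Reeves coefficient, and all default hyperparameters are assumptions and do not reproduce the paper's exact formulation.

```python
import numpy as np

def cg_adam_step(params, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One illustrative conjugate-gradient-flavored Adam update (not the paper's exact rule)."""
    t = state.get("t", 0) + 1
    prev_grad = state.get("prev_grad", np.zeros_like(grad))
    prev_dir = state.get("prev_dir", np.zeros_like(grad))
    m = state.get("m", np.zeros_like(grad))
    v = state.get("v", np.zeros_like(grad))

    # Fletcher-Reeves-style conjugacy coefficient, guarded against division by zero.
    denom = np.dot(prev_grad.ravel(), prev_grad.ravel())
    gamma = np.dot(grad.ravel(), grad.ravel()) / denom if denom > 0 else 0.0

    # Conjugate-gradient-style search direction replaces the raw gradient.
    d = grad + gamma * prev_dir

    # Standard Adam moment estimates applied to the blended direction.
    m = beta1 * m + (1 - beta1) * d
    v = beta2 * v + (1 - beta2) * (d * d)
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    new_params = params - lr * m_hat / (np.sqrt(v_hat) + eps)

    state.update(t=t, prev_grad=grad, prev_dir=d, m=m, v=v)
    return new_params, state
```

In practice the step would be called once per mini-batch with the stochastic gradient, carrying the `state` dictionary between iterations; when `gamma` is zero the update reduces to plain Adam.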