ADASECANT: Robust Adaptive Secant Method for Stochastic Gradient

23 December 2014
Authors: Çağlar Gülçehre, Marcin Moczulski, Yoshua Bengio
Tag: ODL
arXiv: 1412.7419

Papers citing "ADASECANT: Robust Adaptive Secant Method for Stochastic Gradient"

4 / 4 papers shown
1. PSO-Convolutional Neural Networks with Heterogeneous Learning Rate
   N. H. Phong, A. Santos, B. Ribeiro
   20 May 2022

2. Empirical study towards understanding line search approximations for training neural networks
   Younghwan Chae, D. Wilke
   15 Sep 2019

3. Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates
   L. Smith, Nicholay Topin
   Tag: AI4CE
   23 Aug 2017

4. A Robust Adaptive Stochastic Gradient Method for Deep Learning
   Çağlar Gülçehre, Jose M. R. Sotelo, Marcin Moczulski, Yoshua Bengio
   Tag: ODL
   02 Mar 2017