Scaling up Stochastic Gradient Descent for Non-convex Optimisation

6 October 2022
S. Mohamad, H. Alamri, A. Bouchachia

Papers citing "Scaling up Stochastic Gradient Descent for Non-convex Optimisation"

2 / 2 papers shown

Convergence of Adam for Non-convex Objectives: Relaxed Hyperparameters and Non-ergodic Case
Meixuan He, Yuqing Liang, Jinlan Liu, Dongpo Xu
20 Jul 2023

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010