Generalization Error Bounds with Probabilistic Guarantee for SGD in Nonconvex Optimization

19 February 2018
Yi Zhou, Yingbin Liang, Huishuai Zhang
MLT

Papers citing "Generalization Error Bounds with Probabilistic Guarantee for SGD in Nonconvex Optimization"

On the Algorithmic Stability and Generalization of Adaptive Optimization Methods
Han Nguyen, Hai Pham, Sashank J. Reddi, Barnabás Póczos
ODL, AI4CE
08 Nov 2022
Stability and Generalization for Markov Chain Stochastic Gradient Methods
Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou
16 Sep 2022
Improved Learning Rates for Stochastic Optimization: Two Theoretical Viewpoints
Shaojie Li, Yong Liu
19 Jul 2021
Stagewise Training Accelerates Convergence of Testing Error Over SGD
Zhuoning Yuan, Yan Yan, R. L. Jin, Tianbao Yang
10 Dec 2018
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt
16 Aug 2016