© 2025 ResearchTrend.AI, All rights reserved.

Adaptive First-and Zeroth-order Methods for Weakly Convex Stochastic Optimization Problems
arXiv:2005.09261 (v2, latest)

19 May 2020
Parvin Nazari
Davoud Ataee Tarzanagh
George Michailidis
    ODL

Papers citing "Adaptive First-and Zeroth-order Methods for Weakly Convex Stochastic Optimization Problems"

7 / 7 papers shown
On the Algorithmic Stability and Generalization of Adaptive Optimization Methods
Han Nguyen, Hai Pham, Sashank J. Reddi, Barnabas Poczos
ODL, AI4CE
08 Nov 2022

Dynamic Regret of Adaptive Gradient Methods for Strongly Convex Problems
Parvin Nazari, E. Khorram
ODL
04 Sep 2022

Dynamic Regret Analysis for Online Meta-Learning
Parvin Nazari, E. Khorram
CLL
29 Sep 2021

Distributed stochastic inertial-accelerated methods with delayed derivatives for nonconvex problems
Yangyang Xu, Yibo Xu, Yonggui Yan, Jiewei Chen
24 Jul 2021

Solving a class of non-convex min-max games using adaptive momentum methods
Babak Barazandeh, Davoud Ataee Tarzanagh, George Michailidis
26 Apr 2021

Regularized and Smooth Double Core Tensor Factorization for Heterogeneous Data
Davoud Ataee Tarzanagh, George Michailidis
24 Nov 2019

DADAM: A Consensus-based Distributed Adaptive Gradient Method for Online Optimization
Parvin Nazari, Davoud Ataee Tarzanagh, George Michailidis
ODL
25 Jan 2019