ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization

15 February 2019 (arXiv:1902.05679)
Nhan H. Pham, Lam M. Nguyen, Dzung Phan, Quoc Tran-Dinh
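For context on the method that the citing papers build on: ProxSARAH targets composite problems of the form min_x E[f(x; xi)] + g(x), combining a SARAH-style recursive gradient estimator with a proximal step and an averaging step. The sketch below is a minimal illustrative implementation of that general scheme, not the paper's exact pseudocode; the gradient oracle grad_f, the l1 choice of g, and all step-size, batch, and loop-length defaults are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_sarah_sketch(grad_f, x0, n_samples, lam=0.1, eta=0.05, gamma=0.5,
                      n_epochs=5, inner_len=50, batch=1, seed=0):
    """Illustrative ProxSARAH-style loop for min_x E[f(x; xi)] + lam * ||x||_1.

    grad_f(x, idx) is assumed to return the average gradient of f(x; xi_i)
    over the sample indices in idx. Hyperparameter values are placeholders.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        # Outer step: (large-batch) snapshot of the gradient estimator.
        v = grad_f(x, np.arange(n_samples))
        x_prev = x
        x_hat = soft_threshold(x - eta * v, eta * lam)   # proximal step
        x = (1.0 - gamma) * x + gamma * x_hat            # averaging step
        for _ in range(inner_len):
            idx = rng.integers(0, n_samples, size=batch)
            # SARAH-style recursive, variance-reduced gradient estimator.
            v = grad_f(x, idx) - grad_f(x_prev, idx) + v
            x_prev = x
            x_hat = soft_threshold(x - eta * v, eta * lam)
            x = (1.0 - gamma) * x + gamma * x_hat
    return x
```

In this sketch, gamma averages the current iterate with the proximal point; setting gamma = 1 reduces the update to a plain proximal step on the SARAH estimator.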

Papers citing "ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization"

13 citing papers are listed below.
  • A One-Sample Decentralized Proximal Algorithm for Non-Convex Stochastic Composite Optimization. Tesi Xiao, Xuxing Chen, Krishnakumar Balasubramanian, Saeed Ghadimi. 20 Feb 2023.
  • Stochastic Variable Metric Proximal Gradient with variance reduction for non-convex composite optimization. G. Fort, Eric Moulines. 02 Jan 2023.
  • Gradient Descent-Type Methods: Background and Simple Unified Convergence Analysis. Quoc Tran-Dinh, Marten van Dijk. 19 Dec 2022.
  • Stochastic Gradient Methods with Preconditioned Updates. Abdurakhmon Sadiev, Aleksandr Beznosikov, Abdulla Jasem Almansoori, Dmitry Kamzolov, R. Tappenden, Martin Takáč. 01 Jun 2022.
  • A Novel Convergence Analysis for Algorithms of the Adam Family. Zhishuai Guo, Yi Tian Xu, W. Yin, R. L. Jin, Tianbao Yang. 07 Dec 2021.
  • Faster Perturbed Stochastic Gradient Methods for Finding Local Minima. Zixiang Chen, Dongruo Zhou, Quanquan Gu. 25 Oct 2021.
  • ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method. Zhize Li. 21 Mar 2021.
  • PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization. Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik. 25 Aug 2020.
  • Privacy-Preserving Asynchronous Federated Learning Algorithms for Multi-Party Vertically Collaborative Learning. Bin Gu, An Xu, Zhouyuan Huo, Cheng Deng, Heng-Chiao Huang. 14 Aug 2020.
  • A Unified Convergence Analysis for Shuffling-Type Gradient Methods. Lam M. Nguyen, Quoc Tran-Dinh, Dzung Phan, Phuong Ha Nguyen, Marten van Dijk. 19 Feb 2020.
  • Stochastic First-order Methods for Convex and Nonconvex Functional Constrained Optimization. Digvijay Boob, Qi Deng, Guanghui Lan. 07 Aug 2019.
  • Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition. Hamed Karimi, J. Nutini, Mark W. Schmidt. 16 Aug 2016.
  • A Proximal Stochastic Gradient Method with Progressive Variance Reduction. Lin Xiao, Tong Zhang. 19 Mar 2014.