Sharper Analysis for Minibatch Stochastic Proximal Point Methods: Stability, Smoothness, and Deviation

9 January 2023
Xiao-Tong Yuan
P. Li
arXiv · PDF · HTML
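
For context, the minibatch stochastic proximal point update studied in this line of work is commonly written in the following standard form (included here as background, not quoted from this page; η denotes the step size and B_t the sampled minibatch):

\[
x_{t+1} = \arg\min_{x} \; \frac{1}{|B_t|} \sum_{i \in B_t} f_i(x) + \frac{1}{2\eta} \lVert x - x_t \rVert^2 .
\]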

Papers citing "Sharper Analysis for Minibatch Stochastic Proximal Point Methods: Stability, Smoothness, and Deviation"

6 / 6 papers shown

Cohort Squeeze: Beyond a Single Communication Round per Cohort in Cross-Device Federated Learning
Kai Yi, Timur Kharisov, Igor Sokolov, Peter Richtárik (FedML), 03 Jun 2024

Uniformly Stable Algorithms for Adversarial Training and Beyond
Jiancong Xiao, Jiawei Zhang, Zhimin Luo, Asuman Ozdaglar (AAML), 03 May 2024

Accelerated, Optimal, and Parallel: Some Results on Model-Based Stochastic Optimization
Karan N. Chadha, Gary Cheng, John C. Duchi, 07 Jan 2021

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt, 16 Aug 2016

A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang (ODL), 19 Mar 2014

High-dimensional generalized linear models and the lasso
Sara van de Geer, 04 Apr 2008