Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis
Dachao Lin, Yuze Han, Haishan Ye, Zhihua Zhang
arXiv:2304.07504, 15 April 2023
Papers citing "Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis" (14 papers shown):
1. Accelerated Methods with Compressed Communications for Distributed Optimization Problems under Data Similarity. Dmitry Bylinkin, Aleksandr Beznosikov. 21 Dec 2024.
2. Accelerated Stochastic ExtraGradient: Mixing Hessian and Gradient Similarity to Reduce Communication in Distributed and Federated Learning. Dmitry Bylinkin, Kirill Degtyarev, Aleksandr Beznosikov. 22 Sep 2024. [FedML]
3. Nonlinear Perturbation-based Non-Convex Optimization over Time-Varying Networks. Mohammadreza Doostmohammadian, Zulfiya R. Gabidullina, Hamid R. Rabiee. 05 Aug 2024.
4. Stabilized Proximal-Point Methods for Federated Optimization. Xiaowen Jiang, Anton Rodomanov, Sebastian U. Stich. 09 Jul 2024. [FedML]
5. Cohort Squeeze: Beyond a Single Communication Round per Cohort in Cross-Device Federated Learning. Kai Yi, Timur Kharisov, Igor Sokolov, Peter Richtárik. 03 Jun 2024. [FedML]
6. SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Non-convex Cross-Device Federated Learning. Avetik G. Karagulyan, Egor Shulgin, Abdurakhmon Sadiev, Peter Richtárik. 30 May 2024. [FedML]
7. Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity. Qihao Zhou, Haishan Ye, Luo Luo. 25 May 2024.
8. The Effectiveness of Local Updates for Decentralized Learning under Data Heterogeneity. Tongle Wu, Ying Sun. 23 Mar 2024.
9. Faster federated optimization under second-order similarity. Ahmed Khaled, Chi Jin. 06 Sep 2022. [FedML]
10. Optimal Algorithms for Decentralized Stochastic Variational Inequalities. D. Kovalev, Aleksandr Beznosikov, Abdurakhmon Sadiev, Michael Persiianov, Peter Richtárik, Alexander Gasnikov. 06 Feb 2022.
11. One-Point Gradient-Free Methods for Composite Optimization with Applications to Distributed Optimization. I. Stepanov, Artyom Y. Voronov, Aleksandr Beznosikov, Alexander Gasnikov. 13 Jul 2021. [FedML]
12. Acceleration Methods. Alexandre d'Aspremont, Damien Scieur, Adrien B. Taylor. 23 Jan 2021.
13. Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization. Zeyuan Allen-Zhu. 12 Feb 2018. [ODL]
14. A Proximal Stochastic Gradient Method with Progressive Variance Reduction. Lin Xiao, Tong Zhang. 19 Mar 2014. [ODL]