Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning

2 November 2021
Xun Qian, Rustem Islamov, M. Safaryan, Peter Richtárik
FedML


Papers citing "Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning" (10 of 10 papers shown)

Review of Mathematical Optimization in Federated Learning
Shusen Yang, Fangyuan Zhao, Zihao Zhou, Liang Shi, Xuebin Ren, Zongben Xu
FedML, AI4CE · 02 Dec 2024

A second-order-like optimizer with adaptive gradient scaling for deep learning
Jérôme Bolte, Ryan Boustany, Edouard Pauwels, Andrei Purica
ODL · 08 Oct 2024

FAGH: Accelerating Federated Learning with Approximated Global Hessian
Mrinmay Sen, A. K. Qin, Krishna Mohan
FedML · 16 Mar 2024

Improved Communication Efficiency in Federated Natural Policy Gradient via ADMM-based Gradient Updates
Guangchen Lan, Han Wang, James Anderson, Christopher G. Brinton, Vaneet Aggarwal
FedML · 09 Oct 2023

Towards a Better Theoretical Understanding of Independent Subnetwork Training
Egor Shulgin, Peter Richtárik
AI4CE · 28 Jun 2023

Communication Acceleration of Local Gradient Methods via an Accelerated Primal-Dual Algorithm with Inexact Prox
Abdurakhmon Sadiev, D. Kovalev, Peter Richtárik
08 Jul 2022

FedSSO: A Federated Server-Side Second-Order Optimization Algorithm
Xinteng Ma, Renyi Bao, Jinpeng Jiang, Yang Liu, Arthur Jiang, Junhua Yan, Xin Liu, Zhisong Pan
FedML · 20 Jun 2022

Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation
Rustem Islamov, Xun Qian, Slavomír Hanzely, M. Safaryan, Peter Richtárik
07 Jun 2022

Hessian Averaging in Stochastic Newton Methods Achieves Superlinear Convergence
Sen Na, Michal Derezinski, Michael W. Mahoney
20 Apr 2022

SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing
Nicolò Dal Fabbro, S. Dey, M. Rossi, Luca Schenato
FedML · 11 Feb 2022