arXiv: 2012.05625
DONE: Distributed Approximate Newton-type Method for Federated Edge Learning
IEEE Transactions on Parallel and Distributed Systems (TPDS), 2020
10 December 2020
Canh T. Dinh, N. H. Tran, Tuan Dung Nguyen, Wei Bao, A. R. Balef, B. Zhou, Albert Y. Zomaya
FedML
Papers citing "DONE: Distributed Approximate Newton-type Method for Federated Edge Learning" (8 of 8 papers shown)
Accelerated Training of Federated Learning via Second-Order Methods
Mrinmay Sen, Sidhant R Nair, C Krishna Mohan
FedML
29 May 2025
Review of Mathematical Optimization in Federated Learning
Shusen Yang, Fangyuan Zhao, Zihao Zhou, Liang Shi, Xuebin Ren, Zongben Xu
FedML, AI4CE
02 Dec 2024
Scalable and Resource-Efficient Second-Order Federated Learning via Over-the-Air Aggregation
IEEE Wireless Communications Letters (WCL), 2024
Abdulmomen Ghalkha, Chaouki Ben Issaid, Mehdi Bennis
10 Oct 2024
Fed-Sophia: A Communication-Efficient Second-Order Federated Learning Algorithm
Ahmed Elbakary, Chaouki Ben Issaid, Mohammad Shehab, Karim G. Seddik, Tamer A. ElBatt, Mehdi Bennis
10 Jun 2024
FAGH: Accelerating Federated Learning with Approximated Global Hessian
Mrinmay Sen, A. K. Qin, Krishna Mohan
FedML
16 Mar 2024
Q-SHED: Distributed Optimization at the Edge via Hessian Eigenvectors Quantization
Nicolò Dal Fabbro, M. Rossi, Luca Schenato, S. Dey
18 May 2023
Network-GIANT: Fully distributed Newton-type optimization via harmonic Hessian consensus
A. Maritan, Ganesh Sharma, Luca Schenato, S. Dey
13 May 2023
SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing
Nicolò Dal Fabbro, S. Dey, M. Rossi, Luca Schenato
FedML
11 Feb 2022