Distributed Adaptive Newton Methods with Global Superlinear Convergence
Jiaqi Zhang, Keyou You, Tamer Basar
arXiv:2002.07378, 18 February 2020

Papers citing "Distributed Adaptive Newton Methods with Global Superlinear Convergence"

6 papers
Distributed Adaptive Greedy Quasi-Newton Methods with Explicit Non-asymptotic Convergence Bounds
Yubo Du, Keyou You
30 Nov 2023
Decentralized Riemannian natural gradient methods with Kronecker-product approximations
Jiang Hu, Kangkang Deng, Na Li, Shijie Zhao
16 Mar 2023
SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing
Nicolò Dal Fabbro, S. Dey, M. Rossi, Luca Schenato
FedML
11 Feb 2022
Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning: Part I
IEEE Transactions on Signal Processing, 2022
Jiaojiao Zhang, Huikang Liu, Anthony Man-Cho So, Qing Ling
19 Jan 2022
On Second-order Optimization Methods for Federated Learning
Sebastian Bischoff, Stephan Günnemann, Martin Jaggi, Sebastian U. Stich
FedML
06 Sep 2021
Newton Method over Networks is Fast up to the Statistical Precision
International Conference on Machine Learning (ICML), 2021
Amir Daneshmand, G. Scutari, Pavel Dvurechensky, Alexander Gasnikov
12 Feb 2021