ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Distributed Adaptive Newton Methods with Global Superlinear Convergence

18 February 2020
Jiaqi Zhang
Keyou You
Tamer Basar

Papers citing "Distributed Adaptive Newton Methods with Global Superlinear Convergence"

2 papers shown
Variance-Reduced Stochastic Quasi-Newton Methods for Decentralized Learning: Part I
Jiaojiao Zhang
Huikang Liu
Anthony Man-Cho So
Qing Ling
19 Jan 2022
Newton Method over Networks is Fast up to the Statistical Precision
Amir Daneshmand
G. Scutari
Pavel Dvurechensky
Alexander Gasnikov
12 Feb 2021