A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization

7 September 2018 (arXiv:1809.02341)
Zhize Li, Jian Li

Papers citing "A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization"

7 / 7 papers shown

Boosting the Performance of Decentralized Federated Learning via Catalyst Acceleration
Qinglun Li, Miao Zhang, Yingqi Liu, Quanjun Yin, Li Shen, Xiaochun Cao
FedML
09 Oct 2024

FedPAGE: A Fast Local Stochastic Gradient Method for Communication-Efficient Federated Learning
Haoyu Zhao, Zhize Li, Peter Richtárik
FedML
10 Aug 2021

CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression
Zhize Li, Peter Richtárik
20 Jul 2021

ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method
Zhize Li
21 Mar 2021

Acceleration via Fractal Learning Rate Schedules
Naman Agarwal, Surbhi Goel, Cyril Zhang
01 Mar 2021

PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization
Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik
ODL
25 Aug 2020

A Unified Analysis of Stochastic Gradient Methods for Nonconvex Federated Optimization
Zhize Li, Peter Richtárik
FedML
12 Jun 2020