ResearchTrend.AI
Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up

8 May 2019
Dominic Richards, Patrick Rebeschini
arXiv: 1905.03135

Papers citing "Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up"

5 / 5 papers shown
Beyond spectral gap (extended): The role of the topology in decentralized learning
Thijs Vogels, Hadrien Hendrikx, Martin Jaggi
05 Jan 2023
Network Gradient Descent Algorithm for Decentralized Federated Learning
Shuyuan Wu, Danyang Huang, Hansheng Wang
06 May 2022
A General Framework for Analyzing Stochastic Dynamics in Learning Algorithms
Chi-Ning Chou, Juspreet Singh Sandhu, Mien Brabeeba Wang, Tiancheng Yu
11 Jun 2020
The Statistical Complexity of Early-Stopped Mirror Descent
Tomas Vaskevicius, Varun Kanade, Patrick Rebeschini
01 Feb 2020
SlowMo: Improving Communication-Efficient Distributed SGD with Slow Momentum
Jianyu Wang, Vinayak Tantia, Nicolas Ballas, Michael G. Rabbat
01 Oct 2019