Distributed Learning with Random Features

7 June 2019
Jian Li, Yong Liu, Weiping Wang
arXiv:1906.03155 (abs | PDF | HTML)
Abstract

Distributed learning and random projections are among the most common techniques in large-scale nonparametric statistical learning. In this paper, we study the generalization properties of kernel ridge regression using both distributed methods and random features. Theoretical analysis shows that the combination remarkably reduces computational cost while preserving the optimal generalization accuracy under standard assumptions. In a benign case, $\mathcal{O}(\sqrt{N})$ partitions and $\mathcal{O}(\sqrt{N})$ random features are sufficient to achieve the $\mathcal{O}(1/N)$ learning rate, where $N$ is the labeled sample size. Further, we derive more refined results by using additional unlabeled data to enlarge the number of partitions and by generating features in a data-dependent way to reduce the number of random features.
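
The abstract describes a divide-and-conquer scheme: split the $N$ samples into $m$ partitions, fit kernel ridge regression with random features on each partition, and average the local estimators. Below is a minimal NumPy sketch of that scheme, assuming an RBF kernel approximated by random Fourier features and averaging of the local ridge coefficients; the toy data, bandwidth, and ridge parameter are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch of distributed kernel ridge regression with random Fourier
# features, following the m ~ sqrt(N), D ~ sqrt(N) rates quoted in the
# abstract. All hyperparameters and the toy data are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, W, b):
    """Map X to D random Fourier features approximating an RBF kernel."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def fit_ridge(Z, y, lam):
    """Closed-form ridge regression in the random-feature space."""
    D = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)

# Toy regression data: y = sin(3x) + noise.
N = 4000
X = rng.uniform(-1.0, 1.0, size=(N, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(N)

# Benign-case rates from the abstract: ~sqrt(N) partitions and features.
m = int(np.sqrt(N))   # number of partitions
D = int(np.sqrt(N))   # number of random features
sigma = 0.5           # RBF bandwidth (assumed)
lam = 1e-3            # ridge parameter (assumed)

# Shared random features so all local estimators live in one feature space.
W = rng.standard_normal((1, D)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

# Fit one ridge estimator per partition, then average the coefficients
# (equivalent to averaging the local predictors, since they are linear
# in the shared features).
partitions = np.array_split(rng.permutation(N), m)
coefs = [fit_ridge(random_fourier_features(X[idx], W, b), y[idx], lam)
         for idx in partitions]
w_bar = np.mean(coefs, axis=0)

# Evaluate the averaged predictor on a test grid.
X_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y_pred = random_fourier_features(X_test, W, b) @ w_bar
print("test MSE vs. noiseless target:",
      np.mean((y_pred - np.sin(3.0 * X_test[:, 0])) ** 2))
```

With $m \approx \sqrt{N}$ partitions and $D \approx \sqrt{N}$ features, each local solve handles about $\sqrt{N}$ samples in a $\sqrt{N}$-dimensional feature space rather than inverting an $N \times N$ kernel matrix, which is the source of the computational savings the abstract claims.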
