ResearchTrend.AI

DC^2: A Divide-and-conquer Algorithm for Large-scale Kernel Learning with Application to Clustering

16 November 2019
Ke Alexander Wang
Xinran Bian
Pan Liu
Donghui Yan
arXiv: 1911.06944
Abstract

Divide-and-conquer is a general strategy for dealing with large-scale problems. It is typically applied to generate ensemble instances, which potentially limits the problem size it can handle. Additionally, the data are often divided by random sampling, which may be suboptimal. To address these concerns, we propose the DC^2 algorithm. Instead of ensemble instances, we produce structure-preserving signature pieces to be assembled and conquered. DC^2 achieves the efficiency of sampling-based large-scale kernel methods while enabling parallel multicore or clustered computation. The data partition and the subsequent compression are unified by recursive random projections. Empirically, dividing the data by random projections induces smaller mean squared approximation errors than conventional random sampling. The power of DC^2 is demonstrated by our clustering algorithm rpfCluster^+, which is as accurate as some of the fastest approximate spectral clustering algorithms while maintaining a running time close to that of K-means clustering. Analysis of DC^2 applied to spectral clustering shows that the loss in clustering accuracy due to data division and reduction is upper bounded by the data approximation error, which would vanish with recursive random projections. Due to its easy implementation and flexibility, we expect DC^2 to be applicable to general large-scale learning problems.
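The divide step described in the abstract, partitioning data by recursive random projections rather than random sampling, can be sketched as follows. This is a minimal, hypothetical illustration of the general technique only: the function name, the median-split rule, and the stopping criterion are assumptions for exposition, not the authors' DC^2 implementation (the paper's version also compresses each piece into a structure-preserving signature, which is omitted here).

```python
import random

def rp_partition(points, max_size, rng):
    """Recursively split `points` (a list of coordinate tuples) into cells.

    Each level projects the points onto a random Gaussian direction and
    splits them at the median projected value; recursion stops once a
    cell holds at most `max_size` points. This is an illustrative sketch
    of random-projection-based data division, not the paper's algorithm.
    """
    if len(points) <= max_size:
        return [points]
    dim = len(points[0])
    # Draw a random direction with i.i.d. standard-normal coordinates.
    direction = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    scores = [sum(p[i] * direction[i] for i in range(dim)) for p in points]
    median = sorted(scores)[len(scores) // 2]
    left = [p for p, s in zip(points, scores) if s < median]
    right = [p for p, s in zip(points, scores) if s >= median]
    if not left or not right:  # degenerate split (e.g. many ties): stop here
        return [points]
    return (rp_partition(left, max_size, rng)
            + rp_partition(right, max_size, rng))

if __name__ == "__main__":
    rng = random.Random(0)
    pts = [(rng.random(), rng.random()) for _ in range(200)]
    cells = rp_partition(pts, 32, rng)
    # Every point lands in exactly one cell, so the cells can be
    # processed ("conquered") independently on separate cores.
    print(len(cells), sum(len(c) for c in cells))
```

Because the cells are disjoint, each one can be handed to a separate worker, which is the property the abstract's "parallel multicore or clustered computation" claim relies on.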
