
Strong Coresets for Hard and Soft Bregman Clustering with Applications to Exponential Family Mixtures

21 August 2015
Mario Lucic
Olivier Bachem
Andreas Krause
Abstract

Coresets are efficient representations of datasets such that models trained on a coreset are provably competitive with models trained on the original dataset. As such, they have been successfully used to scale up clustering models such as K-Means and Gaussian mixture models to massive datasets. However, until now, the algorithms and corresponding theory were usually specific to each clustering problem. We propose a single, practical algorithm to construct strong coresets for a large class of hard and soft clustering problems based on Bregman divergences. This class includes hard clustering with popular distortion measures such as the Squared Euclidean distance, the Mahalanobis distance, KL-divergence, Itakura-Saito distance and relative entropy. The corresponding soft clustering problems are directly related to popular mixture models due to a dual relationship between Bregman divergences and Exponential family distributions. Our results recover existing coreset constructions for K-Means and Gaussian mixture models and imply polynomial-time approximation schemes for various hard clustering problems.
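The distortion measures named in the abstract all arise as Bregman divergences d_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a suitable convex generator phi. The Python sketch below is a minimal illustration of that definition only (it is not the paper's coreset construction); the generator and function names are illustrative, and it simply checks that three standard generators recover the squared Euclidean, generalized KL, and Itakura-Saito distances.

import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence d_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||^2            -> squared Euclidean distance
sq_norm      = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2 * x

# phi(x) = sum_i x_i log x_i  -> generalized KL divergence / relative entropy
neg_entropy      = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1

# phi(x) = -sum_i log x_i     -> Itakura-Saito distance
burg      = lambda x: -np.sum(np.log(x))
burg_grad = lambda x: -1.0 / x

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.4, 0.4, 0.2])

print(bregman(sq_norm, sq_norm_grad, x, y))          # equals ||x - y||^2
print(bregman(neg_entropy, neg_entropy_grad, x, y))  # equals sum x*log(x/y) - x + y
print(bregman(burg, burg_grad, x, y))                # equals sum x/y - log(x/y) - 1

Hard clustering under any such divergence assigns each point to the center minimizing d_phi, which is how the single framework covers K-Means (squared Euclidean) and its relatives as special cases.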
