Kernel Distribution Embeddings: Universal Kernels, Characteristic Kernels and Kernel Metrics on Distributions

18 April 2016
Carl-Johann Simon-Gabriel
Bernhard Schölkopf
Abstract

Kernel mean embeddings have recently attracted the attention of the machine learning community. They map measures $\mu$ from some set $M$ to functions in a reproducing kernel Hilbert space (RKHS) with kernel $k$. The RKHS distance of two mapped measures is a semi-metric $d_k$ over $M$. We study three questions. (I) For a given kernel, what sets $M$ can be embedded? (II) When is the embedding injective over $M$ (in which case $d_k$ is a metric)? (III) How does the $d_k$-induced topology compare to other topologies on $M$? The existing machine learning literature has addressed these questions in cases where $M$ is (a subset of) the finite regular Borel measures. We unify, improve and generalise those results. Our approach naturally leads to continuous and possibly even injective embeddings of (Schwartz-) distributions, i.e., generalised measures, but the reader is free to focus on measures only. In particular, we systemise and extend various (partly known) equivalences between different notions of universal, characteristic and strictly positive definite kernels, and show that on an underlying locally compact Hausdorff space, $d_k$ metrises the weak convergence of probability measures if and only if $k$ is continuous and characteristic.
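
For a concrete picture of the semi-metric $d_k$, the sketch below (not from the paper) estimates the RKHS distance between the mean embeddings of two empirical measures. It uses a Gaussian kernel, a standard example of a continuous characteristic kernel; the kernel choice, bandwidth, and sample data are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def empirical_dk(X, Y, sigma=1.0):
    """Estimate d_k(mu, nu) = ||embedding(mu) - embedding(nu)||_H for the
    empirical measures mu, nu of the samples X and Y (rows are points)."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    # Squared RKHS distance between the two mean embeddings (biased V-statistic form).
    d2 = Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
    return np.sqrt(max(d2, 0.0))

# Two samples; d_k shrinks toward zero as the underlying distributions coincide.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 1))
Y = rng.normal(0.5, 1.0, size=(500, 1))
print(empirical_dk(X, Y))
```

Because the Gaussian kernel is continuous and characteristic, this $d_k$ is a genuine metric on probability measures, which is the setting of question (II) above.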
