ResearchTrend.AI
Estimating 2-Sinkhorn Divergence between Gaussian Processes from Finite-Dimensional Marginals

5 February 2021
Anton Mallasto
    OT
Abstract

\emph{Optimal Transport} (OT) has emerged as an important computational tool in machine learning and computer vision, providing a geometric framework for studying probability measures. Unfortunately, OT suffers from the curse of dimensionality and requires regularization for practical computations. A popular choice is \emph{entropic regularization}, which can be `unbiased', resulting in the \emph{Sinkhorn divergence}. In this work, we study the convergence of estimates of the 2-Sinkhorn divergence between \emph{Gaussian processes} (GPs) based on their finite-dimensional marginal distributions. We show almost sure convergence of the divergence when the marginals are sampled according to some base measure. Furthermore, we show that using $n$ marginals the estimation error of the divergence scales in a dimension-free way as $\mathcal{O}\left(\epsilon^{-1} n^{-1/2}\right)$, where $\epsilon$ is the magnitude of the entropic regularization.
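To make the objects in the abstract concrete, the following is a minimal NumPy sketch of the entropic-OT cost and the debiased Sinkhorn divergence between two uniform empirical measures, using the squared-Euclidean ground cost that underlies the 2-Sinkhorn divergence. It uses the common transport-cost debiasing $S(x,y) = \mathrm{OT}_\epsilon(x,y) - \tfrac{1}{2}\mathrm{OT}_\epsilon(x,x) - \tfrac{1}{2}\mathrm{OT}_\epsilon(y,y)$; this is an illustrative variant, not the paper's exact estimator for GP marginals, and the function names are our own.

```python
import numpy as np

def sinkhorn_cost(x, y, eps=1.0, n_iter=300):
    """Entropic OT transport cost <P, C> between uniform empirical measures
    on samples x and y, with squared Euclidean ground cost (2-Sinkhorn)."""
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-C / eps)                        # Gibbs kernel
    a = np.full(x.shape[0], 1.0 / x.shape[0])   # uniform source weights
    b = np.full(y.shape[0], 1.0 / y.shape[0])   # uniform target weights
    v = np.ones_like(b)
    for _ in range(n_iter):                     # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]             # entropic transport plan
    return float((P * C).sum())

def sinkhorn_divergence(x, y, eps=1.0):
    """Debiased Sinkhorn divergence S(x, y); S(x, x) = 0 by construction."""
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))

# Toy example: point clouds standing in for samples from two
# finite-dimensional GP marginals (hypothetical data, not from the paper).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(40, 3))
y = rng.normal(2.0, 1.0, size=(40, 3))
div = sinkhorn_divergence(x, y, eps=1.0)
```

Note that the plain-domain kernel `exp(-C / eps)` underflows for small `eps`; practical implementations run the updates in the log domain, which does not change the quantity being estimated.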
