Consider an empirical measure $\mathbb{P}_n$ induced by $n$ iid samples from a $d$-dimensional $K$-subgaussian distribution $\mathbb{P}$, and let $\mathcal{N}_\sigma = \mathcal{N}(0, \sigma^2 I_d)$ be the isotropic Gaussian measure. We study the speed of convergence of the smoothed Wasserstein distance $W_2(\mathbb{P}_n * \mathcal{N}_\sigma, \mathbb{P} * \mathcal{N}_\sigma) = n^{-\alpha + o(1)}$, with $*$ being the convolution of measures. For $K < \sigma$ and in any dimension $d \ge 1$ we show that $\alpha = \tfrac{1}{2}$. For $K > \sigma$ in dimension $d = 1$ we show that the rate is slower and is given by $\alpha = \tfrac{(\sigma^2 + K^2)^2}{4(\sigma^4 + K^4)} < \tfrac{1}{2}$. This resolves several open problems in \cite{goldfeld2020convergence}, and in particular precisely identifies the amount of smoothing $\sigma$ needed to obtain a parametric rate. In addition, we establish that the KL divergence $D(\mathbb{P}_n * \mathcal{N}_\sigma \| \mathbb{P} * \mathcal{N}_\sigma)$ has rate $n^{-1}$ for $K < \sigma$, but only slows down to $n^{-1/2}$ for $K > \sigma$. The surprising difference in the behavior of $W_2^2$ and KL implies the failure of the $T_2$-transportation inequality when $K > \sigma$. Consequently, the requirement $K < \sigma$ is necessary for the validity of the log-Sobolev inequality (LSI) for the Gaussian mixture $\mathbb{P} * \mathcal{N}_\sigma$, closing an open problem in \cite{wang2016functional}, who established the LSI under precisely this condition.
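As a numerical sanity check, the sketch below evaluates a candidate one-dimensional rate exponent $\alpha(K, \sigma) = (\sigma^2 + K^2)^2 / (4(\sigma^4 + K^4))$ for $K \ge \sigma$. Note that this closed form is an assumption reconstructed here (the abstract's displayed formula did not survive extraction; consult the paper for the exact expression). Under that assumption, the exponent matches the parametric value $1/2$ at $K = \sigma$ and decreases strictly below $1/2$ once $K > \sigma$:

```python
def alpha(K: float, sigma: float) -> float:
    """Assumed 1-D rate exponent for the smoothed W2 distance, valid for K >= sigma.

    Hypothetical reconstruction: alpha = (sigma^2 + K^2)^2 / (4 (sigma^4 + K^4)).
    """
    return (sigma**2 + K**2) ** 2 / (4 * (sigma**4 + K**4))

# At K = sigma the exponent equals 1/2, matching the parametric rate regime.
assert abs(alpha(1.0, 1.0) - 0.5) < 1e-12

# For K > sigma the exponent drops strictly below 1/2 and decreases in K
# (approaching 1/4 as K / sigma -> infinity), i.e. the rate genuinely slows down.
vals = [alpha(k, 1.0) for k in (1.5, 2.0, 5.0, 50.0)]
assert all(v < 0.5 for v in vals)
assert vals == sorted(vals, reverse=True)
print(vals)
```

The monotone decrease follows from the derivative: for $\sigma = 1$, $\frac{d}{dK}\frac{(1+K^2)^2}{1+K^4} \propto K(1+K^2)(1-K^2)$, which is negative for $K > 1$.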