Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1

International Conference on Machine Learning (ICML), 2021
Abstract

We study the complexity of Stein Variational Gradient Descent (SVGD), an algorithm for sampling from π(x) ∝ exp(−F(x)) where F is smooth and nonconvex. We provide a clean complexity bound for SVGD in the population limit in terms of the Stein Fisher Information (or squared Kernelized Stein Discrepancy), as a function of the dimension of the problem d and the desired accuracy ε. Unlike existing work, we do not make any assumption on the trajectory of the algorithm. Instead, our key assumption is that the target distribution satisfies Talagrand's inequality T1.
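To make the object of study concrete, the following is a minimal sketch of the standard finite-particle SVGD update (Liu and Wang's update rule with an RBF kernel), not the population-limit analysis of the paper itself. The target, bandwidth h, step size, and particle count are illustrative choices; the score −∇F here is that of a standard Gaussian, i.e. F(x) = ||x||²/2.

```python
import numpy as np

rng = np.random.default_rng(0)

def svgd_step(x, score, step=0.5, h=1.0):
    """One SVGD update on an (n, d) array of particles.

    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    with the RBF kernel k(y, z) = exp(-||y - z||^2 / (2 h^2)).
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]            # x_i - x_j, shape (n, n, d)
    k = np.exp(-np.sum(diff**2, axis=-1) / (2 * h**2))  # kernel matrix, (n, n)
    # Attraction term: kernel-weighted average of the scores.
    attract = k @ score(x)
    # Repulsion term: grad_{x_j} k(x_j, x_i) = (x_i - x_j) / h^2 * k(x_j, x_i).
    repulse = np.sum(diff * k[..., None], axis=1) / h**2
    return x + step * (attract + repulse) / n

# Illustrative target: standard Gaussian, F(x) = ||x||^2 / 2, so score(x) = -x.
score = lambda x: -x
x = rng.normal(loc=5.0, scale=0.5, size=(100, 1))   # particles start far from the target
for _ in range(500):
    x = svgd_step(x, score)
```

After the loop, the particle cloud should have drifted from its initial mean of 5 toward the target mean 0, with the repulsion term keeping the particles spread out rather than collapsing to the mode.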
