Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1
International Conference on Machine Learning (ICML), 2021

Abstract
We study the complexity of Stein Variational Gradient Descent (SVGD), an algorithm that samples from a target distribution π ∝ exp(−V), where the potential V is smooth and nonconvex. We provide a clean complexity bound for SVGD in the population limit in terms of the Stein Fisher information (i.e., the squared Kernelized Stein Discrepancy), as a function of the dimension of the problem d and the desired accuracy ε. Unlike existing work, we make no assumption on the trajectory of the algorithm; instead, our key assumption is that the target distribution satisfies Talagrand's inequality T1.
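For context, the finite-particle SVGD update driving the population-limit dynamics studied above can be sketched as follows. This is a minimal NumPy illustration (not the paper's analysis): it assumes an RBF kernel with a fixed bandwidth, and the names `svgd_step`, `grad_log_p`, and `bandwidth` are illustrative choices, not from the paper.

```python
import numpy as np

def svgd_step(particles, grad_log_p, step_size=0.1, bandwidth=1.0):
    """One SVGD update on an (n, d) particle array.

    grad_log_p maps (n, d) -> (n, d): the score grad log pi = -grad V
    of the target pi ∝ exp(-V). Uses an RBF kernel (an assumption here;
    the theory allows general kernels).
    """
    n, d = particles.shape
    diffs = particles[:, None, :] - particles[None, :, :]   # x_i - x_j, shape (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)                  # squared distances, (n, n)
    k = np.exp(-sq_dists / (2.0 * bandwidth ** 2))          # kernel matrix k(x_j, x_i)
    # gradient of k(x_j, x_i) with respect to x_j: (x_i - x_j)/h^2 * k
    grad_k = diffs * k[:, :, None] / bandwidth ** 2         # (n, n, d)

    scores = grad_log_p(particles)                          # (n, d)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    # first term: weighted drift toward high density; second: repulsion
    phi = (k @ scores + grad_k.sum(axis=1)) / n
    return particles + step_size * phi
```

For example, iterating `svgd_step` with `grad_log_p = lambda x: -x` transports the particles toward a standard Gaussian target.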
