Complexity Analysis of Stein Variational Gradient Descent Under Talagrand's Inequality T1

International Conference on Machine Learning (ICML), 2021
6 June 2021
Adil Salim
Lukang Sun
Peter Richtárik
Abstract

We study the complexity of Stein Variational Gradient Descent (SVGD), an algorithm to sample from π(x) ∝ exp(−F(x)), where F is smooth and nonconvex. We provide a clean complexity bound for SVGD in the population limit in terms of the Stein Fisher information (or squared kernelized Stein discrepancy), as a function of the dimension d of the problem and the desired accuracy ε. Unlike existing work, we do not make any assumption on the trajectory of the algorithm. Instead, our key assumption is that the target distribution satisfies Talagrand's inequality T1.
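For readers unfamiliar with the algorithm analyzed here, the following is a minimal sketch of a finite-particle SVGD update with an RBF kernel (the paper's analysis is in the population limit; the fixed bandwidth, step size, and Gaussian example below are illustrative choices, not taken from the paper):

```python
import numpy as np

def svgd_step(x, score, step=0.05, h=1.0):
    """One SVGD update over a set of particles.

    x     : (n, d) array of particles
    score : function returning grad log pi at each particle, shape (n, d)
    step  : step size (illustrative value)
    h     : RBF kernel bandwidth (illustrative value)
    """
    diff = x[:, None, :] - x[None, :, :]        # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff**2, axis=-1) / h)   # K[j, i] = k(x_j, x_i)
    gradK = -2.0 / h * diff * K[:, :, None]     # grad_{x_j} k(x_j, x_i)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    # i.e. a kernel-weighted drift toward high density plus a repulsive term.
    phi = (K[:, :, None] * score(x)[:, None, :] + gradK).mean(axis=0)
    return x + step * phi

# Toy example: sample from a standard 1-D Gaussian target,
# F(x) = x^2 / 2, so grad log pi(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(3.0, 1.0, size=(100, 1))  # particles start far from the target
for _ in range(1000):
    x = svgd_step(x, lambda p: -p)
print(x.mean(), x.var())  # particle mean and variance approach those of N(0, 1)
```

The repulsive term `gradK` is what distinguishes SVGD from plain gradient ascent on log π: it keeps the particles spread out so that their empirical distribution approximates π rather than collapsing onto a mode.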
