Mirrored Langevin Dynamics

27 February 2018 · arXiv:1802.10174
Ya-Ping Hsieh, Ali Kavis, Paul Rolland, Volkan Cevher
Abstract

We consider the problem of sampling from constrained distributions, which has posed significant challenges to both non-asymptotic analysis and algorithmic design. We propose a unified framework, inspired by classical mirror descent, to derive novel first-order sampling schemes. We prove that, for a general target distribution with strongly convex potential, our framework implies the existence of a first-order algorithm achieving $\tilde{O}(\epsilon^{-2}d)$ convergence, suggesting that the state-of-the-art $\tilde{O}(\epsilon^{-6}d^5)$ can be vastly improved. With the important Latent Dirichlet Allocation (LDA) application in mind, we specialize our algorithm to sample from Dirichlet posteriors, and derive the first non-asymptotic $\tilde{O}(\epsilon^{-2}d^2)$ rate for first-order sampling. We further extend our framework to the mini-batch setting and prove convergence rates when only stochastic gradients are available. Finally, we report promising experimental results for LDA on real datasets.
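To make the dual-space idea concrete, here is a minimal sketch of the Dirichlet specialization: under the entropic mirror map on the simplex, the constrained target is pushed forward to an unconstrained potential on $\mathbb{R}^d$, a plain unadjusted Langevin step is taken there, and the iterate is mapped back to the simplex through the softmax (the inverse mirror map). The function name, step size, and chain length below are illustrative choices rather than the paper's prescriptions; the closed-form gradient follows from the Dirichlet push-forward under this particular parameterization.

```python
import numpy as np


def mirrored_langevin_dirichlet(alpha, n_steps=10_000, step=1e-2, seed=0):
    """Approximately sample from Dirichlet(alpha) via dual-space ULA.

    Sketch of Mirrored Langevin Dynamics with the entropic mirror map:
    the dual iterate y lives in R^d (d = len(alpha) - 1) and maps back
    to the simplex through the softmax.
    """
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    d = alpha.size - 1          # number of free simplex coordinates
    a_sum = alpha.sum()
    y = np.zeros(d)             # unconstrained dual iterate
    samples = np.empty((n_steps, d + 1))
    for k in range(n_steps):
        # Inverse mirror map: x_i = exp(y_i) / (1 + sum_j exp(y_j)),
        # last simplex coordinate is the remainder. Stabilized softmax.
        m = max(0.0, y.max())
        ey = np.exp(y - m)
        x = ey / (ey.sum() + np.exp(-m))
        # Under this map, the Dirichlet target pushes forward to a dual
        # potential W(y) = -sum_i alpha_i * log x_i(y), whose gradient is:
        grad_W = a_sum * x - alpha[:d]
        # Unadjusted Langevin step in the (unconstrained) dual space.
        y = y - step * grad_W + np.sqrt(2.0 * step) * rng.standard_normal(d)
        samples[k, :d] = x
        samples[k, d] = 1.0 - x.sum()
    return samples


if __name__ == "__main__":
    alpha = np.array([2.0, 3.0, 5.0])
    xs = mirrored_langevin_dirichlet(alpha, n_steps=50_000)
    burn = 10_000
    print("empirical mean:", xs[burn:].mean(axis=0))
    print("exact mean:    ", alpha / alpha.sum())
```

Running the script compares the chain's empirical mean against the exact Dirichlet mean $\alpha/\sum_i \alpha_i$; agreement improves as the step size shrinks, reflecting the usual first-order discretization bias of unadjusted Langevin.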
