
arXiv:2301.13706
Non-convex sampling for a mixture of locally smooth potentials

31 January 2023
D. Nguyen
Abstract

The purpose of this paper is to examine the sampling problem through Euler discretization, where the potential function is assumed to be a mixture of locally smooth distributions and weakly dissipative. We introduce $\alpha_G$-mixture local smoothness and $\alpha_H$-mixture local Hessian smoothness, which are novel conditions typically satisfied by a mixture of distributions. Under these conditions, we prove convergence in Kullback–Leibler (KL) divergence, with the number of iterations needed to reach an $\epsilon$-neighborhood of the target distribution depending only polynomially on the dimension. The convergence rate improves when the potential is $1$-smooth and $\alpha_H$-mixture locally Hessian smooth. Our result for potentials that are non-strongly convex outside a ball of radius $R$ is obtained by convexifying the non-convex domains. In addition, we establish several theoretical properties of $p$-generalized Gaussian smoothing and prove convergence in the $L_\beta$-Wasserstein distance for stochastic gradients in a general setting.
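The Euler discretization referenced in the abstract corresponds, in its simplest form, to the unadjusted Langevin algorithm (ULA). The sketch below is an illustration only, not the paper's method: it shows ULA applied to a toy two-component Gaussian mixture potential, whereas the paper's analysis covers general mixtures of locally smooth, weakly dissipative potentials. All function names here (`ula_sample`, `grad_U`) are hypothetical.

```python
import numpy as np

def ula_sample(grad_potential, x0, step_size, n_iters, rng=None):
    """Unadjusted Langevin algorithm: Euler discretization of
    dX_t = -grad U(X_t) dt + sqrt(2) dW_t, targeting pi ∝ exp(-U)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_potential(x) + np.sqrt(2.0 * step_size) * noise
    return x

def grad_U(x, mu=3.0):
    """Gradient of U(x) = -log(exp(-||x-mu||^2/2) + exp(-||x+mu||^2/2)),
    a non-convex potential for an equal-weight two-mode Gaussian mixture."""
    a = np.exp(-0.5 * np.sum((x - mu) ** 2))
    b = np.exp(-0.5 * np.sum((x + mu) ** 2))
    return ((x - mu) * a + (x + mu) * b) / (a + b)

# Draw one approximate sample from the mixture in dimension 2.
sample = ula_sample(grad_U, np.zeros(2), step_size=0.05, n_iters=500)
```

Each iteration costs one gradient evaluation; the step size trades off discretization bias against mixing speed, which is exactly the regime the paper's KL-divergence bounds quantify under its mixture smoothness conditions.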
