This paper examines the sampling problem via Euler discretization, where the potential function is assumed to be a mixture of locally smooth distributions and weakly dissipative. We introduce mixture local smoothness and mixture local Hessian smoothness, novel conditions that are typically satisfied by mixtures of distributions. Under these conditions, we prove convergence in Kullback-Leibler (KL) divergence, with the number of iterations needed to reach a neighborhood of the target distribution depending only polynomially on the dimension. The convergence rate improves when the potential is smooth and mixture locally Hessian smooth. Our result for potentials that are non-strongly convex outside a ball of fixed radius is obtained by convexifying the non-convex domains. In addition, we establish theoretical properties of generalized Gaussian smoothing and prove convergence in the Wasserstein distance for stochastic gradients in a general setting.
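For intuition, the sketch below illustrates the Euler (Euler-Maruyama) discretization of Langevin dynamics that the abstract refers to, applied to a Gaussian-mixture target. It is a minimal illustrative example, not the paper's algorithm or constants: the function names (`ula`, `grad_potential`), the two-component isotropic Gaussian mixture, and the step size are assumptions made for demonstration only.

```python
import numpy as np

def grad_potential(x, means, var=1.0, weights=None):
    # Gradient of U(x) = -log sum_i w_i N(x; mu_i, var*I) for a Gaussian mixture.
    # (Illustrative target; the paper's conditions concern general mixtures.)
    if weights is None:
        weights = np.ones(len(means)) / len(means)
    diffs = x - means                                  # shape (k, d)
    log_comps = -0.5 * np.sum(diffs**2, axis=1) / var + np.log(weights)
    log_comps -= log_comps.max()                       # stabilize the softmax
    resp = np.exp(log_comps)
    resp /= resp.sum()                                 # component responsibilities
    return (resp[:, None] * diffs).sum(axis=0) / var   # equals -grad log density

def ula(grad_U, x0, step=1e-2, n_iters=10_000, seed=None):
    # Euler discretization of Langevin dynamics:
    # x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2*step) * N(0, I)
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples

# Example: sample from a two-component Gaussian mixture in 2D (hypothetical setup).
means = np.array([[-2.0, 0.0], [2.0, 0.0]])
samples = ula(lambda x: grad_potential(x, means), x0=np.zeros(2), step=5e-3, n_iters=20_000)
```

A stochastic-gradient variant of the setting mentioned in the abstract would replace `grad_U(x)` with an unbiased noisy estimate; the update rule itself is unchanged.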