Quantitative $W_1$ Convergence of Langevin-Like Stochastic Processes with Non-Convex Potential State-Dependent Noise

7 July 2019
Xiang Cheng, Dong Yin, Peter L. Bartlett, Michael I. Jordan
arXiv:1907.03215
Abstract

We prove quantitative convergence rates at which discrete Langevin-like processes converge to the invariant distribution of a related stochastic differential equation. We study the setup where the additive noise can be non-Gaussian and state-dependent, and the potential function can be non-convex. We show that the key properties of these processes depend on the potential function and the second moment of the additive noise. We apply our theoretical findings to studying the convergence of Stochastic Gradient Descent (SGD) for non-convex problems and corroborate them with experiments using SGD to train deep neural networks on the CIFAR-10 dataset.
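A Langevin-like discrete process of the kind the abstract describes can be written, schematically, as an Euler-type update x_{k+1} = x_k - η ∇U(x_k) + √η ξ_k, where the innovation ξ_k may be non-Gaussian and have a state-dependent second moment. The Python sketch below simulates one such chain purely as an illustration; the double-well potential U, the noise scale sigma(x), the uniform innovations, and the step size eta are assumed choices for demonstration, not the paper's specific setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_U(x):
    # Gradient of an illustrative non-convex double-well potential
    # U(x) = (x**2 - 1)**2 / 4 (an assumption, not from the paper).
    return x * (x**2 - 1)

def sigma(x):
    # State-dependent noise scale, bounded away from zero (illustrative choice).
    return 1.0 + 0.5 * np.tanh(x)

def step(x, eta):
    # Non-Gaussian innovation: centered uniform with unit variance,
    # scaled by the state-dependent sigma(x).
    xi = sigma(x) * rng.uniform(-np.sqrt(3.0), np.sqrt(3.0))
    return x - eta * grad_U(x) + np.sqrt(eta) * xi

# Run the chain. Under conditions like those studied in the paper, the law
# of x_k approaches the invariant distribution of the related SDE
# dX_t = -grad_U(X_t) dt + sigma(X_t) dB_t, at a rate measured in W_1.
eta, n_steps, burn_in = 1e-2, 200_000, 10_000
x, samples = 0.0, []
for _ in range(n_steps):
    x = step(x, eta)
    samples.append(x)

print("post-burn-in mean:", np.mean(samples[burn_in:]))
print("post-burn-in std: ", np.std(samples[burn_in:]))
```

The same template covers SGD on a non-convex loss: there η ∇U(x_k) + √η ξ_k is replaced by a stochastic gradient whose noise is naturally state-dependent, which is the connection the experiments on CIFAR-10 exploit.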
