Stochastic Subgradient MCMC Methods

27 April 2015
Wenbo Hu
Jun Zhu
Bo Zhang
arXiv:1504.07107
Abstract

Many Bayesian models involve continuous but non-differentiable log-posteriors, including sparse Bayesian methods with a Laplace prior and regularized Bayesian methods with max-margin posterior regularization, which acts like a likelihood term. In analogy to the popular stochastic subgradient methods for deterministic optimization, we present stochastic subgradient MCMC methods for efficient posterior inference in such Bayesian models, targeting large-scale applications. We investigate variants that use adaptive stepsizes and thermostats to improve mixing speed. Experimental results on a wide range of problems demonstrate the effectiveness of our approach.
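The abstract describes the method only at a high level. As a rough illustration of the core idea, the following is a minimal sketch of a stochastic subgradient variant of Langevin dynamics for Bayesian linear regression with a Laplace prior, where np.sign supplies a valid subgradient at the non-differentiable kink of the log-prior. All names and hyperparameters here (stochastic_subgradient_langevin, the prior scale b, the fixed stepsize) are illustrative assumptions, not the authors' implementation, and the paper's adaptive stepsizes and thermostats are omitted.

import numpy as np

def laplace_log_prior_subgrad(theta, b=1.0):
    # Subgradient of the Laplace log-prior: log p(theta) = -|theta|_1 / b + const.
    # np.sign returns 0 at theta = 0, which is a valid subgradient at the kink.
    return -np.sign(theta) / b

def stochastic_subgradient_langevin(X, y, n_iters=5000, batch_size=32,
                                    step=1e-4, sigma=1.0, seed=0):
    # Sketch of stochastic-gradient Langevin dynamics in which the prior
    # contributes a subgradient rather than a gradient (fixed stepsize assumed).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    samples = []
    for _ in range(n_iters):
        # Subsample a minibatch of the data.
        idx = rng.choice(n, size=batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        # Minibatch gradient of the Gaussian log-likelihood, rescaled by
        # n / batch_size to estimate the full-data gradient.
        grad_lik = (n / batch_size) * Xb.T @ (yb - Xb @ theta) / sigma**2
        sub_prior = laplace_log_prior_subgrad(theta)
        # Langevin update: half-step along the (sub)gradient of the
        # log-posterior plus injected Gaussian noise of variance `step`.
        theta = theta + 0.5 * step * (grad_lik + sub_prior) \
                + np.sqrt(step) * rng.standard_normal(d)
        samples.append(theta.copy())
    return np.array(samples)

# Example usage on synthetic sparse-regression data:
# rng = np.random.default_rng(1)
# X = rng.standard_normal((1000, 10))
# theta_true = np.zeros(10); theta_true[:3] = [2.0, -1.5, 1.0]
# y = X @ theta_true + 0.5 * rng.standard_normal(1000)
# posterior_samples = stochastic_subgradient_langevin(X, y)

The injected-noise variance of `step` follows the usual stochastic-gradient Langevin convention in which the gradient term carries a factor of step/2; replacing the prior's gradient with a subgradient is the only change relative to standard SGLD in this sketch.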
