
Fast Sampling for Bayesian Max-Margin Models

Jun Zhu
Bo Zhang
Abstract

Bayesian max-margin models, which combine max-margin learning with Bayesian likelihood regularization, have shown great promise in a variety of machine learning tasks, yet Monte Carlo sampling for these models remains challenging, especially in large-scale settings. As an alternative to the data augmentation technique for handling the non-smoothness of the hinge loss, we present a stochastic subgradient MCMC method that is easy to implement and computationally efficient. We investigate variants that use adaptive stepsizes and thermostats to improve mixing speed for the Bayesian linear SVM. Furthermore, we design a stochastic subgradient HMC-within-Gibbs method and a doubly stochastic HMC algorithm for the mixture of SVMs, a popular extension of linear classifiers. Experimental results on a wide range of problems demonstrate the effectiveness of our approach.
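To make the idea concrete, below is a minimal sketch of a stochastic subgradient HMC sampler for a Bayesian linear SVM, in the spirit of the method summarized above. All names (`stoch_subgrad_U`, `sghmc_svm`), hyperparameter values, and the particular friction-based update are illustrative assumptions, not the paper's exact algorithm; the adaptive-stepsize and thermostat variants are not reproduced here.

```python
import numpy as np

def stoch_subgrad_U(theta, X, y, C, n_total, batch_idx):
    """Stochastic subgradient of the potential
    U(theta) = 0.5*||theta||^2 + C * sum_i max(0, 1 - y_i * x_i.theta),
    estimated on a minibatch and rescaled to the full data set.
    (Assumed form: standard Gaussian prior plus hinge loss.)"""
    Xb, yb = X[batch_idx], y[batch_idx]
    margins = yb * (Xb @ theta)
    active = margins < 1.0                       # points where the hinge is active
    g_hinge = -(Xb[active].T @ yb[active])       # subgradient of the hinge sum
    scale = n_total / len(batch_idx)             # unbiased minibatch rescaling
    return theta + C * scale * g_hinge           # prior gradient + loss subgradient

def sghmc_svm(X, y, C=1.0, eps=1e-4, friction=0.1,
              n_iters=5000, batch_size=64, rng=None):
    """A minimal SGHMC-style sampler for a Bayesian linear SVM
    (sketch only; the paper's exact update rules may differ)."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    theta = np.zeros(d)
    r = rng.normal(size=d)                       # auxiliary momentum variable
    samples = []
    for _ in range(n_iters):
        idx = rng.choice(n, size=batch_size, replace=False)
        g = stoch_subgrad_U(theta, X, y, C, n, idx)
        # Friction term compensates for minibatch subgradient noise,
        # following the stochastic-gradient HMC discretization.
        r = (1 - friction) * r - eps * g \
            + rng.normal(scale=np.sqrt(2 * friction * eps), size=d)
        theta = theta + eps * r
        samples.append(theta.copy())
    return np.array(samples)
```

A thermostat variant would additionally adapt the friction online from the observed momentum statistics, and an adaptive-stepsize variant would tune `eps` per coordinate; both are omitted here for brevity.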
