Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees

21 November 2019
Ruqi Zhang, Christopher De Sa
arXiv:1911.09771

Papers citing "Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees"

5 papers shown:

1. DP-Fast MH: Private, Fast, and Accurate Metropolis-Hastings for Large-Scale Bayesian Inference. Wanrong Zhang, Ruqi Zhang. 10 Mar 2023.
2. Fast Doubly-Adaptive MCMC to Estimate the Gibbs Partition Function with Weak Mixing Time Bounds. Shahrzad Haddadan, Zhuang Yue, Cyrus Cousins, E. Upfal. 14 Nov 2021.
3. Where Is the Normative Proof? Assumptions and Contradictions in ML Fairness Research. A. Feder Cooper. 20 Oct 2020.
4. AMAGOLD: Amortized Metropolis Adjustment for Efficient Stochastic Gradient MCMC. Ruqi Zhang, A. Feder Cooper, Christopher De Sa. 29 Feb 2020.
5. Improving Sampling Accuracy of Stochastic Gradient MCMC Methods via Non-uniform Subsampling of Gradients. Ruilin Li, Xin Wang, H. Zha, Molei Tao. 20 Feb 2020.