No MCMC for me: Amortized sampling for fast and stable training of energy-based models
arXiv:2010.04230 (v3, latest)
International Conference on Learning Representations (ICLR), 2020
8 October 2020
Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud

Papers citing "No MCMC for me: Amortized sampling for fast and stable training of energy-based models"

Showing 12 of 62 citing papers.
Learning Proposals for Practical Energy-Based Regression
  International Conference on Artificial Intelligence and Statistics (AISTATS), 2021
  L. Kumar, Martin Danelljan, Thomas B. Schon
  22 Oct 2021
JEM++: Improved Techniques for Training JEM
  Xiulong Yang, Shihao Ji
  19 Sep 2021
Energy-Based Open-World Uncertainty Modeling for Confidence Calibration
  IEEE International Conference on Computer Vision (ICCV), 2021
  Yezhen Wang, Yue Liu, Tong Che, Kaiyang Zhou, Ziwei Liu, Dongsheng Li
  27 Jul 2021
On Out-of-distribution Detection with Energy-based Models
  Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
  03 Jul 2021
Conjugate Energy-Based Models
  International Conference on Machine Learning (ICML), 2021
  Hao Wu, Babak Esmaeili, Michael L. Wick, Jean-Baptiste Tristan, Jan-Willem van de Meent
  25 Jun 2021
Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels
  Jiuxiang Gu, Changyou Chen, Jinhui Xu
  10 May 2021
Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models
  IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2021
  Sam Bond-Taylor, Adam Leach, Yang Long, Chris G. Willcocks
  08 Mar 2021
Oops I Took A Gradient: Scalable Sampling for Discrete Distributions
  International Conference on Machine Learning (ICML), 2021
  Will Grathwohl, Kevin Swersky, Milad Hashemi, David Duvenaud, Chris J. Maddison
  08 Feb 2021
How to Train Your Energy-Based Models
  Yang Song, Diederik P. Kingma
  09 Jan 2021
Learning Energy-Based Models With Adversarial Training
  European Conference on Computer Vision (ECCV), 2020
  Xuwang Yin, Shiying Li, Gustavo K. Rohde
  11 Dec 2020
Improved Contrastive Divergence Training of Energy Based Models
  International Conference on Machine Learning (ICML), 2020
  Yilun Du, Shuang Li, J. Tenenbaum, Igor Mordatch
  02 Dec 2020
MCMC Should Mix: Learning Energy-Based Model with Neural Transport Latent Space MCMC
  International Conference on Learning Representations (ICLR), 2020
  Erik Nijkamp, Ruiqi Gao, Pavel Sountsov, Srinivas Vasudevan, Bo Pang, Song-Chun Zhu, Ying Nian Wu
  12 Jun 2020