Oops I Took A Gradient: Scalable Sampling for Discrete Distributions

8 February 2021
Will Grathwohl, Kevin Swersky, Milad Hashemi, David Duvenaud, Chris J. Maddison

Papers citing "Oops I Took A Gradient: Scalable Sampling for Discrete Distributions"

11 papers shown
Discrete Neural Flow Samplers with Locally Equivariant Transformer
Zijing Ou, Ruixiang Zhang, Yingzhen Li
23 May 2025
Guide your favorite protein sequence generative model
Junhao Xiong, Hunter Nisonoff, Maria Lukarska, Ishan Gaur, Luke M. Oltrogge, David F. Savage, Jennifer Listgarten
07 May 2025
Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling
Michal Balcerak, Tamaz Amiranashvili, Suprosanna Shit, Antonio Terpin, Lea Bogensperger, Sebastian Kaltenbach, Petros Koumoutsakos, Bjoern Menze
14 Apr 2025
Enhancing Gradient-based Discrete Sampling via Parallel Tempering
Luxu Liang, Yuhang Jia, Feng Zhou
26 Feb 2025
Unlocking Guidance for Discrete State-Space Diffusion and Flow Models
Hunter Nisonoff, Junhao Xiong, Stephan Allenspach, Jennifer Listgarten
03 Jun 2024
No MCMC for me: Amortized sampling for fast and stable training of energy-based models
Will Grathwohl, Jacob Kelly, Milad Hashemi, Mohammad Norouzi, Kevin Swersky, David Duvenaud
08 Oct 2020
Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One
Will Grathwohl, Kuan-Chieh Wang, J. Jacobsen, David Duvenaud, Mohammad Norouzi, Kevin Swersky
06 Dec 2019
Generative Modeling by Estimating Gradients of the Data Distribution
Yang Song, Stefano Ermon
12 Jul 2019
On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models
Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
29 Mar 2019
Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
Qiang Liu, Dilin Wang
16 Aug 2016
Adaptive Gibbs samplers and related MCMC methods
K. Łatuszyński, Gareth O. Roberts, Jeffrey S. Rosenthal
31 Jan 2011