Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling

12 March 2020
Tong Che, Ruixiang Zhang, Jascha Narain Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio
Topics: DiffM, DRL
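The paper's core idea is that a trained GAN discriminator induces an energy-based model over the latent space, which can be sampled with Langevin dynamics instead of drawing latents directly from the prior. The sketch below illustrates that sampling loop under loud assumptions: the `generator` and `disc_logit` functions are toy stand-ins (a fixed linear map and a quadratic logit), not the paper's trained networks, and the gradient is taken by finite differences so no autodiff framework is needed.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))  # toy "generator" weights (assumption)

def generator(z):
    # Hypothetical generator G(z); a real GAN generator would go here.
    return z @ A.T

def disc_logit(x):
    # Hypothetical discriminator logit d(x); higher means "more real".
    return -0.5 * np.sum(x ** 2, axis=-1)

def energy(z):
    # Latent energy E(z) = ||z||^2 / 2 - d(G(z)): the standard-normal
    # prior's energy plus the discriminator's correction term.
    return 0.5 * np.sum(z ** 2, axis=-1) - disc_logit(generator(z))

def grad_energy(z, eps=1e-4):
    # Central finite-difference gradient of E, so the sketch is self-contained.
    g = np.zeros_like(z)
    for i in range(z.shape[-1]):
        dz = np.zeros_like(z)
        dz[i] = eps
        g[i] = (energy(z + dz) - energy(z - dz)) / (2 * eps)
    return g

def ddls_sample(n_steps=200, step=0.01, dim=2):
    # Unadjusted Langevin dynamics on E(z), then decode through the generator.
    z = rng.standard_normal(dim)
    for _ in range(n_steps):
        z = z - 0.5 * step * grad_energy(z) + np.sqrt(step) * rng.standard_normal(dim)
    return generator(z)

x = ddls_sample()
print(x.shape)  # (2,)
```

The paper additionally uses a Metropolis-adjusted variant and calibrated discriminator logits; this unadjusted loop only shows the shape of the procedure, not a faithful reimplementation.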

Papers citing "Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling"

20 / 20 papers shown
Deep MMD Gradient Flow without adversarial training (10 May 2024)
Alexandre Galashov, Valentin De Bortoli, Arthur Gretton
Topics: DiffM

Adversarial Botometer: Adversarial Analysis for Social Bot Detection (03 May 2024)
S. Najari, Davood Rafiee, Mostafa Salehi, R. Farahbakhsh
Topics: AAML, DeLMO

Refining Generative Process with Discriminator Guidance in Score-based Diffusion Models (28 Nov 2022)
Dongjun Kim, Yeongmin Kim, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
Topics: DiffM

Robust and Controllable Object-Centric Learning through Energy-based Models (11 Oct 2022)
Ruixiang Zhang, Tong Che, B. Ivanovic, Renhao Wang, Marco Pavone, Yoshua Bengio, Liam Paull
Topics: OCL

Unifying Generative Models with GFlowNets and Beyond (06 Sep 2022)
Dinghuai Zhang, Ricky T. Q. Chen, Nikolay Malkin, Yoshua Bengio
Topics: BDL, AI4CE

Diffusion Models: A Comprehensive Survey of Methods and Applications (02 Sep 2022)
Ling Yang, Zhilong Zhang, Yingxia Shao, Shenda Hong, Runsheng Xu, Yue Zhao, Wentao Zhang, Bin Cui, Ming-Hsuan Yang
Topics: DiffM, MedIm

Polarity Sampling: Quality and Diversity Control of Pre-Trained Generative Networks via Singular Values (03 Mar 2022)
Ahmed Imtiaz Humayun, Randall Balestriero, Richard Baraniuk

A Generic Approach for Enhancing GANs by Regularized Latent Optimization (07 Dec 2021)
Yufan Zhou, Chunyuan Li, Changyou Chen, Jinhui Xu

Rebooting ACGAN: Auxiliary Classifier GANs with Stable Training (01 Nov 2021)
Minguk Kang, Woohyeon Shim, Minsu Cho, Jaesik Park
Topics: GAN

Bounds all around: training energy-based models with bidirectional bounds (01 Nov 2021)
Cong Geng, Jia Wang, Zhiyong Gao, J. Frellsen, Søren Hauberg

Controllable and Compositional Generation with Latent-Space Energy-Based Models (21 Oct 2021)
Weili Nie, Arash Vahdat, Anima Anandkumar

Score-based Generative Modeling in Latent Space (10 Jun 2021)
Arash Vahdat, Karsten Kreis, Jan Kautz
Topics: DiffM

Latent Space Refinement for Deep Generative Models (01 Jun 2021)
R. Winterhalder, Marco Bellagente, Benjamin Nachman
Topics: BDL, GAN, DRL, DiffM

Training GANs with Stronger Augmentations via Contrastive Discriminator (17 Mar 2021)
Jongheon Jeong, Jinwoo Shin

NEO: Non Equilibrium Sampling on the Orbit of a Deterministic Transform (17 Mar 2021)
Achille Thin, Yazid Janati, Sylvain Le Corff, Charles Ollion, Arnaud Doucet, Alain Durmus, Eric Moulines, C. Robert

Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models (08 Mar 2021)
Sam Bond-Taylor, Adam Leach, Yang Long, Chris G. Willcocks
Topics: VLM, TPM

EBMs Trained with Maximum Likelihood are Generator Models Trained with a Self-adverserial Loss (23 Feb 2021)
Zhisheng Xiao, Qing Yan, Y. Amit

Improved Contrastive Divergence Training of Energy Based Models (02 Dec 2020)
Yilun Du, Shuang Li, J. Tenenbaum, Igor Mordatch

A Neural Network MCMC sampler that maximizes Proposal Entropy (07 Oct 2020)
Zengyi Li, Yubei Chen, Friedrich T. Sommer

Denoising Diffusion Probabilistic Models (19 Jun 2020)
Jonathan Ho, Ajay Jain, Pieter Abbeel
Topics: DiffM