ResearchTrend.AI
Exploring bidirectional bounds for minimax-training of Energy-based models

5 June 2025
Cong Geng
Jia Wang
Li Chen
Zhiyong Gao
Jes Frellsen
Søren Hauberg
arXiv (abs) · PDF · HTML
Abstract

Energy-based models (EBMs) estimate unnormalized densities in an elegant framework, but they are generally difficult to train. Recent work has linked EBMs to generative adversarial networks by noting that they can be trained through a minimax game using a variational lower bound. To avoid the instabilities caused by minimizing a lower bound, we propose to instead work with bidirectional bounds, meaning that we maximize a lower bound and minimize an upper bound when training the EBM. We investigate four different bounds on the log-likelihood derived from different perspectives. We derive lower bounds based on the singular values of the generator Jacobian and on mutual information. To upper bound the negative log-likelihood, we consider a gradient-penalty-like bound, as well as one based on diffusion processes. In all cases, we provide algorithms for evaluating the bounds. We compare the different bounds to investigate the pros and cons of each approach. Finally, we demonstrate that the use of bidirectional bounds stabilizes EBM training and yields high-quality density estimation and sample generation.
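The bidirectional idea described in the abstract can be illustrated with a toy 1D sketch (this is an illustration of the general principle, not the paper's actual algorithm; the energy, generator, and penalty weight below are all hypothetical choices). For an affine generator g(z) = a·z + b, the entropy of the generated distribution follows from the log-absolute Jacobian determinant, log|a|, giving a tractable lower-bound-style entropy term; a gradient-penalty-style term on the energy plays the role of the upper-bound-side regularizer.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x, mu=0.0):
    # Toy quadratic energy E(x) = (x - mu)^2 / 2 (hypothetical choice)
    return 0.5 * (x - mu) ** 2

def grad_energy(x, mu=0.0):
    # Analytic gradient dE/dx for the quadratic energy
    return x - mu

def bidirectional_loss(a, b, mu=0.0, n=1000):
    z = rng.standard_normal(n)
    x = a * z + b                                # generator samples g(z) = a*z + b
    entropy_z = 0.5 * np.log(2 * np.pi * np.e)   # entropy of the N(0,1) latent
    lower_entropy = entropy_z + np.log(abs(a))   # entropy via log|det J| = log|a|
    e_term = energy(x, mu).mean()                # mean energy of generated samples
    gp = (grad_energy(x, mu) ** 2).mean()        # gradient-penalty-style term
    # Lower-bound side: low energy plus high entropy of the generator;
    # the penalty acts as an upper-bound-style regularizer (weight 0.1 is arbitrary).
    return e_term - lower_entropy + 0.1 * gp

# A generator matched to the energy's mode scores lower than a mismatched one
loss_good = bidirectional_loss(a=1.0, b=0.0)
loss_bad = bidirectional_loss(a=1.0, b=5.0)
```

In a real instantiation the generator and energy would be neural networks and the entropy term would require the bounds derived in the paper; here the affine generator makes the Jacobian term exact.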

@article{geng2025_2506.04609,
  title={Exploring bidirectional bounds for minimax-training of Energy-based models},
  author={Cong Geng and Jia Wang and Li Chen and Zhiyong Gao and Jes Frellsen and Søren Hauberg},
  journal={arXiv preprint arXiv:2506.04609},
  year={2025}
}
Main: 19 pages · 5 figures · 6 tables · Bibliography: 7 pages