
Categorical Reparameterization with Denoising Diffusion Models

Samson Gourevitch
Alain Durmus
Eric Moulines
Jimmy Olsson
Yazid Janati
Main: 8 pages · Appendix: 23 pages · Bibliography: 3 pages · 12 figures · 9 tables
Abstract

Learning models with categorical variables requires optimizing expectations over discrete distributions, a setting in which stochastic gradient-based optimization is challenging due to the non-differentiability of categorical sampling. A common workaround is to replace the discrete distribution with a continuous relaxation, yielding a smooth surrogate that admits reparameterized gradient estimates via the reparameterization trick. Building on this idea, we introduce ReDGE, a novel and efficient diffusion-based soft reparameterization method for categorical distributions. Our approach defines a flexible class of gradient estimators that includes the Straight-Through estimator as a special case. Experiments spanning latent variable models and inference-time reward guidance in discrete diffusion models demonstrate that ReDGE consistently matches or outperforms existing gradient-based methods. The code will be made available at this https URL.
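The abstract does not spell out ReDGE itself, but the baseline it generalizes is the standard continuous-relaxation approach: sample a Gumbel-Softmax relaxation of a categorical variable, optionally discretizing on the forward pass (the Straight-Through estimator). A minimal numpy sketch of that baseline (forward pass only; `gumbel_softmax` and its parameters are illustrative names, not the paper's API):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, hard=False, rng=None):
    """Continuous relaxation of a categorical sample (forward pass only).

    tau controls smoothness: tau -> 0 approaches a one-hot categorical draw.
    With hard=True, the forward value is the one-hot argmax; in an autodiff
    framework the gradient would be routed through the soft sample instead
    (the Straight-Through trick).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via inverse-CDF of uniform samples
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    soft = np.exp(y - y.max())       # numerically stable softmax
    soft /= soft.sum()
    if hard:
        one_hot = np.zeros_like(soft)
        one_hot[np.argmax(soft)] = 1.0
        return one_hot
    return soft
```

As `tau` shrinks, the soft sample concentrates on one category, trading lower bias for higher gradient variance; the Straight-Through variant keeps exact discrete forward values at the cost of a biased gradient.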
