
Sample and Map from a Single Convex Potential: Generation using Conjugate Moment Measures

Main: 8 pages · Appendix: 14 pages · Bibliography: 3 pages · 17 figures · 5 tables
Abstract

The canonical approach in generative modeling is to split model fitting into two blocks: first define how to sample noise (e.g., Gaussian), then choose what to do with it (e.g., using a single map or flows). In this work we explore an alternative route that ties sampling and mapping together. We find inspiration in moment measures, a result stating that for any measure $\rho$ there exists a unique convex potential $u$ such that $\rho = \nabla u \sharp e^{-u}$. While this does seem to tie sampling (from the log-concave distribution $e^{-u}$) to action (pushing particles through $\nabla u$), we observe on simple examples (e.g., Gaussians or 1D distributions) that this choice is ill-suited for practical tasks. We study an alternative factorization, where $\rho$ is factorized as $\nabla w^* \sharp e^{-w}$, with $w^*$ the convex conjugate of a convex potential $w$. We call this approach conjugate moment measures, and show far more intuitive results on these examples. Because $\nabla w^*$ is the Monge map between the log-concave distribution $e^{-w}$ and $\rho$, we rely on optimal transport solvers to propose an algorithm that recovers $w$ from samples of $\rho$, parameterizing $w$ as an input-convex neural network. We also address the common sampling scenario in which the density of $\rho$ is known only up to a normalizing constant, and propose an algorithm to learn $w$ in this setting.
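The two factorizations can be checked by hand on the Gaussian case the abstract alludes to. The sketch below is our own illustration (not code from the paper): for a 1D target $\rho = \mathcal{N}(0, \sigma^2)$ and quadratic potentials, the moment-measure identity $\rho = \nabla u \sharp e^{-u}$ forces the noise scale to move *opposite* to the target (noise std $s = 1/\sigma$), while the conjugate factorization $\rho = \nabla w^* \sharp e^{-w}$ gives a noise scale that tracks it ($t = \sigma^{1/3}$). We verify both pushforwards empirically with numpy; the closed forms for $\nabla u$ and $\nabla w^*$ follow from elementary calculus on the quadratic potentials.

```python
import numpy as np

# Target: rho = N(0, sigma^2), a 1D Gaussian.
#
# Moment measure: rho = grad(u) # e^{-u}.
#   With u(x) = x^2/(2 s^2) + const, e^{-u} = N(0, s^2) and grad(u)(x) = x/s^2,
#   so the pushforward is N(0, 1/s^2). Matching sigma^2 requires s = 1/sigma:
#   the noise variance moves opposite to the target's.
#
# Conjugate moment measure: rho = grad(w*) # e^{-w}.
#   With w(y) = y^2/(2 t^2) + const, the conjugate is w*(x) = t^2 x^2/2 - const,
#   so grad(w*)(y) = t^2 y and the pushforward is N(0, t^6). Matching sigma^2
#   requires t = sigma^{1/3}: the noise variance now tracks the target's.

sigma = 2.0
rng = np.random.default_rng(0)
n = 200_000

# Moment measure: sample e^{-u} = N(0, s^2), push through x -> x / s^2.
s = 1.0 / sigma
x = rng.normal(0.0, s, size=n)
pushed_mm = x / s**2
print(np.var(pushed_mm))   # ~ sigma^2 = 4

# Conjugate moment measure: sample e^{-w} = N(0, t^2), push through y -> t^2 y.
t = sigma ** (1.0 / 3.0)
y = rng.normal(0.0, t, size=n)
pushed_cmm = t**2 * y
print(np.var(pushed_cmm))  # ~ sigma^2 = 4
```

Both maps reproduce the target variance, but only the conjugate parameterization keeps the noise distribution on the same scale as $\rho$, which is the intuition the abstract points to.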
