
Entropy-Informed Weighting Channel Normalizing Flow for Deep Generative Models

Main: 33 pages · 14 figures · 6 tables
Bibliography: 1 page · Appendix: 1 page
Abstract

Normalizing Flows (NFs) are widely used in deep generative models for their exact likelihood estimation and efficient sampling. However, they require substantial memory because the latent space matches the input dimension. Multi-scale architectures address this by progressively reducing latent dimensions while preserving reversibility. Existing multi-scale architectures, however, rely on simple, static channel-wise splitting, which limits expressiveness. To improve this, we introduce a regularized, feature-dependent $\mathtt{Shuffle}$ operation and integrate it into the vanilla multi-scale architecture. This operation adaptively generates channel-wise weights and shuffles latent variables before splitting them. We observe that it guides the variables to evolve in the direction of increasing entropy, hence we refer to NFs with the $\mathtt{Shuffle}$ operation as \emph{Entropy-Informed Weighting Channel Normalizing Flow} (EIW-Flow). Extensive experiments on CIFAR-10, CelebA, ImageNet, and LSUN demonstrate that EIW-Flow achieves state-of-the-art density estimation and competitive sample quality for deep generative modeling, with minimal computational overhead.
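To make the idea concrete, the following is a minimal NumPy sketch of a feature-dependent channel shuffle applied before a multi-scale split. It is an illustrative assumption of how such an operation could work, not the paper's exact method: per-channel weights are derived here from a crude entropy proxy (log of the channel standard deviation), the channels are reordered by weight (a permutation, hence invertible given the stored order), and the result is split in half as in a vanilla multi-scale flow. The function name, the weight rule, and the temperature parameter are all hypothetical.

```python
import numpy as np

def entropy_informed_shuffle_split(z, temperature=1.0):
    """Hypothetical sketch: feature-dependent channel shuffle + split.

    z: latent tensor of shape (C, H, W).
    Returns the retained half, the factored-out half, and the
    permutation needed to invert the shuffle.
    """
    C = z.shape[0]
    # Crude per-channel differential-entropy proxy: log of the std dev.
    ent = np.log(z.reshape(C, -1).std(axis=1) + 1e-6)
    # Channel-wise weights via a softmax (feature-dependent, not static).
    w = np.exp(ent / temperature)
    w = w / w.sum()
    # Shuffle: reorder channels by descending weight. This is a
    # permutation, so the operation stays reversible.
    order = np.argsort(-w)
    z_shuffled = z[order]
    # Static split of the shuffled channels into kept / factored-out halves.
    z_keep, z_out = z_shuffled[: C // 2], z_shuffled[C // 2 :]
    return z_keep, z_out, order

# Usage: shuffle-and-split a toy (8, 4, 4) latent, then invert it.
rng = np.random.default_rng(0)
z = rng.standard_normal((8, 4, 4))
z_keep, z_out, order = entropy_informed_shuffle_split(z)
z_rec = np.concatenate([z_keep, z_out], axis=0)[np.argsort(order)]
```

Because the shuffle is a permutation, its log-determinant contribution to the change-of-variables formula is zero, which is why it can be inserted into a flow at negligible cost.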
