
Contractive Diffusion Probabilistic Models

Abstract

Diffusion probabilistic models (DPMs) have emerged as a promising technique in generative modeling. The success of DPMs relies on two ingredients: time reversal of diffusion processes and score matching. Most existing works implicitly assume that score matching is close to perfect, an assumption that is questionable in practice. In view of possibly imperfect score matching, we propose a new design criterion for DPMs -- contraction of the backward sampling process -- leading to a novel class of contractive DPMs (CDPMs). The key insight is that contraction in the backward process narrows both score matching errors and discretization errors, so CDPMs are robust to both sources of error. For practical use, we show that CDPMs can leverage pretrained DPMs by a simple transformation, with no retraining required. We corroborate our approach with experiments on synthetic one-dimensional examples, Swiss Roll, MNIST, CIFAR-10 32×32, and AFHQ 64×64 datasets. Notably, CDPM achieves the best performance among all known SDE-based DPMs.
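To make the error-accumulation point concrete, the following is a minimal sketch of a standard Euler–Maruyama sampler for the variance-preserving (VP) reverse-time SDE, the SDE-based sampling setup the abstract refers to. It is a generic illustration, not the paper's CDPM transformation; the function names and the constant-`beta` schedule are assumptions for the toy example. Any error in `score_fn` perturbs the drift at every step, which is the failure mode a contractive backward process is designed to damp.

```python
import numpy as np

def reverse_sde_sample(score_fn, x_T, beta=1.0, n_steps=500, T=1.0, rng=None):
    """Euler-Maruyama discretization of the VP reverse-time SDE
        dx = [-(beta/2) x - beta * score(x, t)] dt + sqrt(beta) dW_bar,
    integrated backward from t = T to t = 0 (illustrative sketch only)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x_T, dtype=float).copy()
    dt = T / n_steps
    for i in range(n_steps):
        t = T - i * dt
        # Reverse-SDE drift; the minus sign below accounts for dt < 0
        # when stepping backward in time.
        drift = -(beta / 2.0) * x - beta * score_fn(x, t)
        x = x - drift * dt + np.sqrt(beta * dt) * rng.standard_normal(x.shape)
        # Score-matching errors enter the drift at every step; whether they
        # accumulate or shrink depends on the contractivity of this map.
    return x

# Toy check: if the data distribution is N(0, 1), the VP forward process
# leaves it invariant, so the exact score is simply -x at every time t,
# and sampling should return approximately standard normal draws.
true_score = lambda x, t: -x
x_T = np.random.default_rng(0).standard_normal(20000)
samples = reverse_sde_sample(true_score, x_T, rng=1)
```

With the exact score, the sample mean and standard deviation stay close to 0 and 1; substituting an imperfect `score_fn` biases every step, motivating the contraction criterion.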
