ScoreFusion: Fusing Score-based Generative Models via Kullback-Leibler Barycenters

We introduce ScoreFusion, a theoretically grounded method for fusing multiple pre-trained diffusion models, each assumed to sample from an auxiliary population. ScoreFusion is particularly useful for enhancing the generative modeling of a target population with limited observed data. Our starting point considers the family of KL barycenters of the auxiliary populations, which is proven to be an optimal parametric class in the KL sense but difficult to learn. Nevertheless, by recasting the learning problem as score matching in denoising diffusion, we obtain a tractable way of computing the optimal KL barycenter weights. We prove a dimension-free sample complexity bound in total variation distance, provided that the auxiliary models are well-fitted to their own tasks and the auxiliary tasks combined capture the target well. The sample efficiency of ScoreFusion is demonstrated on a handwritten-digit generation task. We also provide a simple adaptation of a Stable Diffusion denoising pipeline that enables sampling from the KL barycenter of two auxiliary checkpoints; on a portrait generation task, our method produces faces that enhance population heterogeneity relative to the auxiliary distributions.
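To make the fusion idea concrete, here is a minimal toy sketch (not the paper's implementation). It assumes the key property of KL barycenters: the barycenter density is proportional to a weighted geometric mixture of the auxiliary densities, so its score function is the convex combination of the auxiliary scores. We stand in two 1D Gaussian score functions for the pre-trained models, fuse them with a weight `lam` (hypothetical name), and draw samples via unadjusted Langevin dynamics:

```python
import numpy as np

# Illustrative stand-ins for pre-trained auxiliary models: the score
# (gradient of the log density) of two 1D Gaussians, N(-2, 1) and N(3, 1).
def score_a(x, mu=-2.0, sigma=1.0):
    return -(x - mu) / sigma**2

def score_b(x, mu=3.0, sigma=1.0):
    return -(x - mu) / sigma**2

def fused_score(x, lam=0.5):
    # KL-barycenter score: a convex combination of the auxiliary scores,
    # since the barycenter density is proportional to p_a^lam * p_b^(1-lam).
    return lam * score_a(x) + (1.0 - lam) * score_b(x)

def langevin_sample(score, n=5000, steps=500, eps=1e-2, seed=0):
    # Unadjusted Langevin dynamics: x <- x + eps * score(x) + sqrt(2*eps) * noise.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    for _ in range(steps):
        x = x + eps * score(x) + np.sqrt(2.0 * eps) * rng.standard_normal(n)
    return x

samples = langevin_sample(fused_score)
# For equal-variance Gaussians with lam = 0.5, the barycenter is a single
# Gaussian centered at the weighted mean 0.5 * (-2) + 0.5 * 3 = 0.5,
# not a bimodal arithmetic mixture of the two auxiliaries.
print(samples.mean())
```

In the paper, `lam` is learned from the limited target data via score matching rather than fixed by hand; the toy example only illustrates what sampling from the barycenter means.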
@article{liu2025_2406.19619,
  title   = {ScoreFusion: Fusing Score-based Generative Models via Kullback-Leibler Barycenters},
  author  = {Hao Liu and Junze Tony Ye and Jose Blanchet and Nian Si},
  journal = {arXiv preprint arXiv:2406.19619},
  year    = {2025}
}