Scalable Expectation Estimation with Subtractive Mixture Models

Abstract

Many Monte Carlo (MC) and importance sampling (IS) methods use mixture models (MMs) for their simplicity and ability to capture multimodal distributions. Recently, subtractive mixture models (SMMs), i.e., MMs with negative coefficients, have shown greater expressiveness and success in generative modeling. However, their negative parameters complicate sampling, requiring costly auto-regressive techniques or accept-reject algorithms that do not scale in high dimensions. In this work, we use the difference representation of SMMs to construct an unbiased IS estimator (ΔEx) that removes the need to sample from the SMM, enabling high-dimensional expectation estimation with SMMs. In our experiments, we show that ΔEx can achieve comparable estimation quality to auto-regressive sampling while being considerably faster in MC estimation. Moreover, we conduct initial experiments with ΔEx using hand-crafted proposals, gaining first insights into how to construct safe proposals for ΔEx.
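The core observation — importance weights for an SMM only require *evaluating* its (signed) density, never sampling from it — can be illustrated with a generic self-normalized IS sketch. This is not the paper's ΔEx estimator; the 1-D subtractive mixture and the proposal below are hypothetical choices for illustration:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def smm_unnormalized(x):
    # Hypothetical subtractive mixture: p(x) ∝ N(x; 0, 1) - 0.5 * N(x; 0, 0.5^2).
    # The difference is non-negative for all x, and the normalizing
    # constant is 1 - 0.5 = 0.5 (mixture coefficients integrate out).
    return normal_pdf(x, 0.0, 1.0) - 0.5 * normal_pdf(x, 0.0, 0.5)

def snis_expectation(f, n_samples, seed=0):
    """Self-normalized IS estimate of E_p[f] under the SMM density p.

    Samples come from an ordinary (easy-to-sample) proposal q; the SMM
    only ever appears inside the importance weight p~(x) / q(x).
    """
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        x = rng.gauss(0.0, 2.0)  # proposal q = N(0, 2^2), heavier-tailed than p
        w = smm_unnormalized(x) / normal_pdf(x, 0.0, 2.0)
        num += w * f(x)
        den += w
    return num / den

est = snis_expectation(lambda x: x * x, 200_000)
# Exact value: E_p[x^2] = (1 - 0.5 * 0.25) / 0.5 = 1.75
```

Because the proposal here has heavier tails than the SMM, the weights stay bounded; choosing such "safe" proposals systematically is exactly the question the paper's hand-crafted-proposal experiments probe.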

@article{zellinger2025_2503.21346,
  title={Scalable Expectation Estimation with Subtractive Mixture Models},
  author={Lena Zellinger and Nicola Branchini and Víctor Elvira and Antonio Vergari},
  journal={arXiv preprint arXiv:2503.21346},
  year={2025}
}