Revisiting the balance heuristic for estimating normalising constants

18 August 2019
F. Medina-Aguayo
R. Everitt
arXiv:1908.06514 (abs | PDF | HTML)
Abstract

Multiple importance sampling estimators are widely used for computing intractable normalising constants owing to their reliability and robustness. The celebrated balance heuristic estimator belongs to this class of methods and has proved very successful in computer graphics. The basic ingredients for computing the estimator are a set of proposal distributions, indexed by a discrete label, and a predetermined number of draws from each of these proposals. However, if the number of available proposals is much larger than the number of permitted importance points, one must select, possibly at random, which of these distributions to use. This work focuses on that setting, exploring improvements and variations of the balance heuristic via a novel extended-space representation of the estimator, which leads to straightforward annealing schemes for variance reduction. We also consider the intractable scenario where the proposal density is available only as a joint function with the discrete label, as may be encountered in problems where an ordering is imposed. For this case, we study combinations of correlated unbiased estimators that also fit into the extended-space representation and, in turn, yield further useful solutions.
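For context, the sketch below illustrates the standard balance heuristic multiple importance sampling estimator of a normalising constant that the abstract builds on: a family of proposals indexed by a discrete label and a predetermined number of draws from each. It is a minimal illustration, not the paper's extended-space or annealed variants; the unnormalised target, the Gaussian proposals, and the sample counts are assumptions chosen for the example.

```python
# Minimal sketch of the balance heuristic estimator of Z = \int gamma(x) dx,
# where gamma is an unnormalised density. All targets, proposals, and sample
# counts below are illustrative assumptions, not from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Unnormalised target: gamma(x) = exp(-x^2 / 2), so the true Z is sqrt(2*pi).
def gamma_unnorm(x):
    return np.exp(-0.5 * x**2)

# A small family of Gaussian proposals q_j, indexed by the discrete label j.
means = [-2.0, 0.0, 2.0]
scales = [1.5, 1.0, 1.5]
n_draws = [200, 400, 200]  # predetermined number of draws from each proposal

def q_pdf(j, x):
    return stats.norm.pdf(x, loc=means[j], scale=scales[j])

# Balance heuristic estimator:
#   Z_hat = sum_j sum_{i=1}^{n_j} gamma(x_{j,i}) / sum_k n_k q_k(x_{j,i})
z_hat = 0.0
for j, n_j in enumerate(n_draws):
    x = rng.normal(means[j], scales[j], size=n_j)  # draws from proposal q_j
    mixture = sum(n_k * q_pdf(k, x) for k, n_k in enumerate(n_draws))
    z_hat += np.sum(gamma_unnorm(x) / mixture)

print(f"balance heuristic estimate: {z_hat:.4f}")
print(f"true value sqrt(2*pi):      {np.sqrt(2 * np.pi):.4f}")
```

Each sample is weighted by the effective mixture of all proposals rather than by its own proposal alone, which is what gives the balance heuristic its robustness when individual proposals poorly cover parts of the target. The paper's setting arises when the label set is much larger than the number of draws one can afford, so the labels themselves must be sampled.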
