

U-Statistics for Importance-Weighted Variational Inference

27 February 2023
Javier Burroni
Kenta Takatsu
Justin Domke
Daniel Sheldon
Abstract

We propose the use of U-statistics to reduce variance for gradient estimation in importance-weighted variational inference. The key observation is that, given a base gradient estimator that requires m > 1 samples and a total of n > m samples to be used for estimation, lower variance is achieved by averaging the base estimator over overlapping batches of size m rather than over disjoint batches, as is currently done. We use classical U-statistic theory to analyze the variance reduction, and propose novel approximations with theoretical guarantees to ensure computational efficiency. We find empirically that U-statistic variance reduction can lead to modest to significant improvements in inference performance on a range of models, with little computational cost.
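Below is a minimal sketch, not the authors' implementation, contrasting the two averaging schemes the abstract describes: averaging a base estimator over n/m disjoint batches versus an (incomplete) U-statistic that averages the same base estimator over overlapping size-m subsets of the n samples. The base estimator `log_mean` is an illustrative placeholder standing in for an importance-weighted objective, and the subset count is an arbitrary assumption.

```python
# Sketch: disjoint-batch averaging vs. an incomplete U-statistic over
# overlapping size-m subsets. The toy base estimator is a placeholder
# for an importance-weighted gradient/objective estimator.
import numpy as np

rng = np.random.default_rng(0)

def log_mean(w):
    # Placeholder base estimator that consumes m samples.
    return np.log(np.mean(w))

def disjoint_estimate(w, m):
    # Average the base estimator over n/m disjoint batches (current practice).
    batches = w.reshape(-1, m)
    return np.mean([log_mean(b) for b in batches])

def u_statistic_estimate(w, m, num_subsets=200):
    # Incomplete U-statistic: average over randomly drawn overlapping
    # size-m subsets, approximating the complete U-statistic that
    # averages over all C(n, m) subsets.
    n = len(w)
    subsets = [rng.choice(n, size=m, replace=False) for _ in range(num_subsets)]
    return np.mean([log_mean(w[idx]) for idx in subsets])

# Compare estimator variance over repeated draws of n "importance weights".
n, m, reps = 64, 8, 2000
disjoint_vals, u_vals = [], []
for _ in range(reps):
    w = rng.lognormal(size=n)  # stand-in for importance weights
    disjoint_vals.append(disjoint_estimate(w, m))
    u_vals.append(u_statistic_estimate(w, m))

print("disjoint-batch variance:", np.var(disjoint_vals))
print("U-statistic variance:   ", np.var(u_vals))
```

Under these assumptions, the overlapping-subset average typically shows lower variance than the disjoint-batch average for the same n samples, which is the effect the paper analyzes with classical U-statistic theory.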
