Importance Sampling via Score-based Generative Models

7 February 2025
Heasung Kim
Taekyun Lee
Hyeji Kim
Gustavo de Veciana
MedIm · DiffM
Abstract

Importance sampling, which involves sampling from a probability density function (PDF) proportional to the product of an importance weight function and a base PDF, is a powerful technique with applications in variance reduction, biased or customized sampling, data augmentation, and beyond. Inspired by the growing availability of score-based generative models (SGMs), we propose an entirely training-free importance sampling framework that relies solely on an SGM for the base PDF. Our key innovation is realizing the importance sampling process as a backward diffusion process, expressed in terms of the score function of the base PDF and the specified importance weight function, both readily available, eliminating the need for any additional training. We conduct a thorough analysis demonstrating the method's scalability and effectiveness across diverse datasets and tasks, including importance sampling for industrial and natural images with neural importance weight functions. The training-free aspect of our method is particularly compelling in real-world scenarios where a single base distribution underlies multiple biased sampling tasks, each requiring a different importance weight function. To the best of our knowledge, our approach is the first importance sampling framework to achieve this.
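To make the key idea concrete, here is a minimal, self-contained sketch (not the authors' implementation) of how a backward diffusion can target the tilted density p̃(x) ∝ w(x)·p(x): since ∇ log p̃(x) = ∇ log p(x) + ∇ log w(x), the reverse SDE can be driven by the base score plus the gradient of the log importance weight. The toy below assumes a standard-Gaussian base PDF and a Gaussian weight; base_score, grad_log_weight, and the linear noise schedule are illustrative placeholders, not the paper's API.

import numpy as np

def base_score(x, t):
    # Score of the base PDF at noise level t. For a standard-Gaussian
    # base under a variance-preserving diffusion this is exactly -x;
    # with real data it would be a pretrained SGM network's output.
    return -x

def grad_log_weight(x, mu=2.0):
    # Gradient of log w(x) for the illustrative Gaussian weight
    # w(x) = exp(-||x - mu||^2 / 2), which biases samples toward mu.
    return -(x - mu)

def importance_sample(n_steps=1000, dim=2, seed=0):
    # Euler-Maruyama discretization of a variance-preserving reverse SDE
    # driven by the combined score. Simplification: applying grad_log_weight
    # naively at every noise level is only an approximation; the paper
    # derives the exact backward process for this setting.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)          # start from the N(0, I) prior
    dt = 1.0 / n_steps
    for i in range(n_steps, 0, -1):
        t = i / n_steps
        beta = 0.1 + (20.0 - 0.1) * t     # linear noise schedule (assumed)
        score = base_score(x, t) + grad_log_weight(x)
        x = (x + (0.5 * beta * x + beta * score) * dt
             + np.sqrt(beta * dt) * rng.standard_normal(dim))
    return x

print(importance_sample())  # draws land roughly near the tilted target N(mu/2, I/2)

In this Gaussian toy the tilted density p̃(x) ∝ exp(-||x||²/2) · exp(-||x - mu||²/2) is itself Gaussian, N(mu/2, I/2), so the output can be sanity-checked by eye; with a learned SGM, only base_score changes, while the importance weight can be swapped per task without any retraining, which is the training-free property the abstract highlights.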

View on arXiv
@article{kim2025_2502.04646,
  title={Importance Sampling via Score-based Generative Models},
  author={Heasung Kim and Taekyun Lee and Hyeji Kim and Gustavo de Veciana},
  journal={arXiv preprint arXiv:2502.04646},
  year={2025}
}