arXiv:2509.23092
Sensitivity Analysis for Diffusion Models

27 September 2025
Christopher Scarvelis, Justin Solomon
Topic: DiffM
Links: arXiv (abs) · PDF · HTML · GitHub (2,646★)
Length: Main 9 pages · 10 figures · Bibliography 2 pages · Appendix 8 pages
Abstract

Training a diffusion model approximates a map from a data distribution ρ to the optimal score function s_t for that distribution. Can we differentiate this map? If we could, then we could predict how the score, and ultimately the model's samples, would change under small perturbations to the training set before committing to costly retraining. We give a closed-form procedure for computing this map's directional derivatives, relying only on black-box access to a pre-trained score model and its derivatives with respect to its inputs. We extend this result to estimate the sensitivity of a diffusion model's samples to additive perturbations of its target measure, with runtime comparable to sampling from a diffusion model and computing log-likelihoods along the sample path. Our method is robust to numerical and approximation error, and the resulting sensitivities correlate with changes in an image diffusion model's samples after retraining and fine-tuning.
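The central object here, the derivative of the score with respect to a perturbation of the data distribution, can be illustrated in a toy setting where everything is available in closed form. The sketch below is not the paper's procedure; it assumes a 1D Gaussian data distribution N(μ, σ²) under a variance-preserving forward process with α_t = e^{-t/2}, for which the optimal score of the diffused marginal is known, and compares a finite-difference estimate of the score's sensitivity to the data mean against the analytic answer. All function names are illustrative.

```python
import math

def score(x, t, mu, sigma=1.0):
    # Optimal score of the VP-diffused marginal of N(mu, sigma^2):
    # x_t ~ N(alpha_t * mu, alpha_t^2 * sigma^2 + (1 - alpha_t^2)),
    # so s_t(x) = -(x - alpha_t * mu) / var_t.
    alpha = math.exp(-0.5 * t)
    var = alpha**2 * sigma**2 + (1.0 - alpha**2)
    return -(x - alpha * mu) / var

def score_sensitivity_fd(x, t, mu, eps=1e-5):
    # Directional derivative of the score under a shift of the data mean,
    # estimated by central finite differences (black-box access only).
    return (score(x, t, mu + eps) - score(x, t, mu - eps)) / (2.0 * eps)

def score_sensitivity_exact(x, t, mu, sigma=1.0):
    # Analytic derivative for comparison: d s_t / d mu = alpha_t / var_t.
    alpha = math.exp(-0.5 * t)
    var = alpha**2 * sigma**2 + (1.0 - alpha**2)
    return alpha / var
```

Because the score is linear in μ in this toy model, the finite-difference estimate agrees with the analytic sensitivity up to floating-point error; in the paper's setting, the data distribution is only accessible through a learned score network, which is what makes a closed-form, black-box procedure for such derivatives useful.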
