Sequential Kernelized Stein Discrepancy

26 September 2024
Diego Martinez-Taboada
Aaditya Ramdas
Abstract

We present a sequential version of the kernelized Stein discrepancy goodness-of-fit test, which allows for conducting goodness-of-fit tests for unnormalized densities that are continuously monitored and adaptively stopped. That is, the sample size need not be fixed prior to data collection; the practitioner can choose whether to stop the test or continue to gather evidence at any time while controlling the false discovery rate. In stark contrast to related literature, we do not impose uniform boundedness on the Stein kernel. Instead, we exploit the potential boundedness of the Stein kernel at arbitrary point evaluations to define test martingales that give rise to the subsequent novel sequential tests. We prove the validity of the test, as well as an asymptotic lower bound for the logarithmic growth of the wealth process under the alternative. We further illustrate the empirical performance of the test with a variety of distributions, including restricted Boltzmann machines.
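The abstract combines two ingredients: a kernelized Stein discrepancy, which measures goodness of fit for an unnormalized density through its score function, and a test martingale (wealth process) that turns mean-zero Stein-kernel payoffs into an anytime-valid test via Ville's inequality. The Python sketch below illustrates those ingredients under simplifying assumptions that the paper itself relaxes: it uses an RBF base kernel with the Langevin Stein operator, a fixed betting fraction lam, and an assumed uniform bound ("bound") on the Stein kernel, whereas avoiding such uniform boundedness is precisely the paper's contribution. The names rbf_stein_kernel and sequential_ksd_test are illustrative, not taken from the paper.

import numpy as np

def rbf_stein_kernel(x, y, score_x, score_y, sigma=1.0):
    """Langevin-Stein kernel k_p(x, y) with an RBF base kernel.

    score_x and score_y are grad log p evaluated at x and y; the score is
    unaffected by the normalizing constant, so p may be unnormalized.
    """
    d = x.shape[0]
    diff = x - y
    sq = diff @ diff
    k = np.exp(-sq / (2 * sigma ** 2))
    grad_y_k = k * diff / sigma ** 2       # gradient of k in its second argument
    grad_x_k = -k * diff / sigma ** 2      # gradient of k in its first argument
    trace_term = k * (d / sigma ** 2 - sq / sigma ** 4)
    return (score_x @ score_y * k
            + score_x @ grad_y_k
            + score_y @ grad_x_k
            + trace_term)

def sequential_ksd_test(stream, score_fn, alpha=0.05, lam=0.5,
                        bound=10.0, sigma=1.0):
    """Anytime-valid goodness-of-fit test by betting on Stein-kernel payoffs.

    The stream is consumed in disjoint pairs; under the null, each payoff
    h_t = k_p(x_{2t-1}, x_{2t}) has mean zero, so if |h_t| <= bound the
    wealth process is a nonnegative martingale and Ville's inequality
    bounds the false-alarm probability by alpha at any stopping time.
    The clip below is only a numerical safeguard for that assumption.
    """
    wealth, prev, t = 1.0, None, 0
    for t, x in enumerate(stream, start=1):
        if prev is None:
            prev = x
            continue
        h = rbf_stein_kernel(prev, x, score_fn(prev), score_fn(x), sigma)
        payoff = np.clip(h, -bound, bound) / bound   # payoff in [-1, 1]
        wealth *= 1.0 + lam * payoff                 # lam in (0, 1] keeps wealth >= 0
        prev = None
        if wealth >= 1.0 / alpha:                    # Ville threshold: reject the null
            return True, t, wealth
    return False, t, wealth

# Example: null p = N(0, 1), whose score is grad log p(x) = -x, against shifted data.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.5, size=(2000, 1))
print(sequential_ksd_test(data, score_fn=lambda x: -x))

A practical refinement, and part of what the paper analyzes, is to choose the betting fraction adaptively from past payoffs rather than fixing it; that choice governs the logarithmic growth rate of the wealth under the alternative.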

@article{martinez-taboada2025_2409.17505,
  title={Sequential Kernelized Stein Discrepancy},
  author={Diego Martinez-Taboada and Aaditya Ramdas},
  journal={arXiv preprint arXiv:2409.17505},
  year={2025}
}