This paper provides a unifying view of optimal kernel hypothesis testing across the MMD two-sample, HSIC independence, and KSD goodness-of-fit frameworks. Minimax optimal separation rates in the kernel and metrics are presented, together with two adaptive kernel selection methods (kernel pooling and aggregation), and under various testing constraints: computational efficiency, differential privacy, and robustness to data corruption. Intuition behind the derivation of the power results is provided in a unified way across the three frameworks, and open problems are highlighted.
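As a point of reference for the MMD two-sample framework mentioned above, the following is a minimal sketch (not the paper's implementation) of a quadratic-time MMD permutation test with a Gaussian kernel; the bandwidth, permutation count, and level are illustrative choices.

import numpy as np

def gaussian_kernel(X, Y, bandwidth):
    # Pairwise Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)).
    sq_dists = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd_squared(X, Y, bandwidth):
    # Biased (V-statistic) estimate of MMD^2 between samples X and Y.
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

def mmd_permutation_test(X, Y, bandwidth=1.0, num_permutations=200, alpha=0.05, seed=0):
    # Reject H0: P = Q if the observed statistic exceeds the (1 - alpha)
    # quantile of the statistic recomputed on permuted pooled samples.
    rng = np.random.default_rng(seed)
    observed = mmd_squared(X, Y, bandwidth)
    pooled = np.vstack([X, Y])
    n = X.shape[0]
    perm_stats = []
    for _ in range(num_permutations):
        idx = rng.permutation(pooled.shape[0])
        perm_stats.append(mmd_squared(pooled[idx[:n]], pooled[idx[n:]], bandwidth))
    threshold = np.quantile(perm_stats, 1 - alpha)
    return observed > threshold, observed, threshold

# Example: samples from two Gaussians with different means should be rejected.
X = np.random.default_rng(1).normal(0.0, 1.0, size=(100, 2))
Y = np.random.default_rng(2).normal(0.5, 1.0, size=(100, 2))
print(mmd_permutation_test(X, Y))

The adaptive kernel selection methods discussed in the paper (kernel pooling and aggregation) would replace the single fixed bandwidth used in this sketch with a collection of kernels.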
@article{schrab2025_2503.07084,
  title   = {A Unified View of Optimal Kernel Hypothesis Testing},
  author  = {Antonin Schrab},
  journal = {arXiv preprint arXiv:2503.07084},
  year    = {2025}
}