
Consistency of Learned Sparse Grid Quadrature Rules using NeuralODEs

Hanno Gottschalk
Emil Partow
Tobias J. Riedlinger
Main: 27 Pages
1 Figure
Bibliography: 3 Pages
Abstract

This paper provides a proof of the consistency of sparse grid quadrature for numerical integration of high-dimensional distributions. In a first step, a transport map is learned that normalizes the distribution to a noise distribution on the unit cube. This step builds on the recently established statistical learning theory of neural ordinary differential equations. In a second step, the composition of the generative map with the quantity of interest is integrated numerically using Clenshaw-Curtis sparse grid quadrature. A decomposition of the total numerical error into a quadrature error and a statistical error is provided. As the main result, it is proven in the framework of empirical risk minimization that all error terms can be controlled in the sense of PAC (probably approximately correct) learning, and that, with high probability, the numerical integral approximates the theoretical value up to an arbitrarily small error in the limit where the data set size grows and the network capacity is increased adaptively.
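The two-step procedure described in the abstract can be illustrated with a minimal numerical sketch. The sketch below is not the paper's method: in place of a trained NeuralODE transport map it uses a simple analytic stand-in (pushing uniform noise on the unit cube to the per-coordinate density 2x), and it uses a full tensor-product Clenshaw-Curtis rule in low dimension rather than the Smolyak sparse-grid construction analyzed in the paper. The names transport_map, quantity_of_interest, and tensor_cc_integral are illustrative, not taken from the paper.

import itertools
import math
import numpy as np


def clenshaw_curtis_01(n):
    """Clenshaw-Curtis nodes and weights with n+1 points, rescaled to [0, 1]."""
    k = np.arange(n + 1)
    nodes = np.cos(k * np.pi / n)  # Chebyshev extrema on [-1, 1]
    weights = np.empty(n + 1)
    for i in k:
        c = 1.0 if i in (0, n) else 2.0
        s = 0.0
        for j in range(1, n // 2 + 1):
            b = 1.0 if 2 * j == n else 2.0
            s += b / (4 * j**2 - 1) * math.cos(2 * j * i * np.pi / n)
        weights[i] = (c / n) * (1.0 - s)
    # rescale from [-1, 1] to [0, 1]
    return (nodes + 1.0) / 2.0, weights / 2.0


def transport_map(u):
    """Hypothetical stand-in for the learned NeuralODE transport map:
    pushes uniform noise on the unit cube to the density 2x per coordinate."""
    return np.sqrt(u)


def quantity_of_interest(x):
    """Example quantity of interest: sum of the coordinates."""
    return np.sum(x)


def tensor_cc_integral(f, dim, n):
    """Full tensor-product Clenshaw-Curtis quadrature of f(transport_map(u))
    over [0, 1]^dim; the paper uses the Smolyak sparse-grid combination instead."""
    nodes, weights = clenshaw_curtis_01(n)
    total = 0.0
    for idx in itertools.product(range(n + 1), repeat=dim):
        u = np.array([nodes[i] for i in idx])
        w = math.prod(weights[i] for i in idx)
        total += w * f(transport_map(u))
    return total


if __name__ == "__main__":
    d, n = 3, 8
    approx = tensor_cc_integral(quantity_of_interest, d, n)
    exact = d * 2.0 / 3.0  # E[sum X_i] with density 2x on [0, 1] per coordinate
    print(f"quadrature: {approx:.6f}, exact: {exact:.6f}")

Replacing transport_map with a trained NeuralODE flow and the tensor product with the Smolyak sparse-grid combination recovers the pipeline whose consistency the paper analyzes; the error then splits into the statistical error of the learned map and the quadrature error of the sparse grid rule, as in the abstract.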

@article{gottschalk2025_2507.01533,
  title={Consistency of Learned Sparse Grid Quadrature Rules using NeuralODEs},
  author={Hanno Gottschalk and Emil Partow and Tobias J. Riedlinger},
  journal={arXiv preprint arXiv:2507.01533},
  year={2025}
}