Manifold learning in Wasserstein space

This paper aims at building the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures $\mathcal{P}(\Omega)$, with $\Omega$ a compact and convex subset of $\mathbb{R}^n$, metrized with the Wasserstein-2 distance $W_2$. We begin by introducing a construction of submanifolds $\Lambda$ in $\mathcal{P}(\Omega)$ equipped with metric $W_\Lambda$, the geodesic restriction of $W_2$ to $\Lambda$. In contrast to other constructions, these submanifolds are not necessarily flat, but still allow for local linearizations in a similar fashion to Riemannian submanifolds of $\mathbb{R}^n$. We then show how the latent manifold structure of $\Lambda$ can be learned from samples $\{\lambda_i\}$ of $\Lambda$ and pairwise extrinsic Wasserstein distances $W_2$ only. In particular, we show that the metric space $(\Lambda, W_\Lambda)$ can be asymptotically recovered in the sense of Gromov--Wasserstein from a graph with nodes $\{\lambda_i\}$ and edge weights $W_2(\lambda_i, \lambda_j)$. In addition, we demonstrate how the tangent space at a sample $\lambda$ can be asymptotically recovered via spectral analysis of a suitable ``covariance operator'' using optimal transport maps from $\lambda$ to sufficiently close and diverse samples $\{\lambda_i\}$. The paper closes with some explicit constructions of submanifolds $\Lambda$ and numerical examples on the recovery of tangent spaces through spectral analysis.
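The tangent-space recovery step described above can be illustrated in a minimal setting. The sketch below is not the paper's construction but a hypothetical one-dimensional toy example: on the real line the optimal transport map between measures is the monotone rearrangement of quantiles, so the displacement $T_i - \mathrm{id}$ from a base measure to each nearby sample can be computed directly, and a discretized covariance operator built from these displacements reveals the dimension of the submanifold through its dominant eigenvalues. All variable names and the choice of a two-parameter affine family are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Quantile representation of the base measure: sorted support points.
base = np.sort(rng.normal(size=n))

# Nearby samples on a 2-parameter submanifold: affine pushforwards
# x -> a_i * x + b_i of the base measure (hypothetical family).
m = 40
a = 1.0 + 0.1 * rng.uniform(-1.0, 1.0, size=m)
b = 0.1 * rng.uniform(-1.0, 1.0, size=m)
# a_i > 0, so each row stays sorted and remains a quantile representation.
neighbors = a[:, None] * base[None, :] + b[:, None]

# In 1D the OT map sends the k-th quantile of the base to the k-th
# quantile of each neighbor, so the displacement v_i = T_i - id is:
V = neighbors - base[None, :]          # shape (m, n)

# Discretized "covariance operator" C = (1/m) sum_i v_i (x) v_i, with the
# L^2(base) inner product approximated by uniform weights 1/n.
C = (V.T @ V) / (m * n)
eigvals = np.linalg.eigvalsh(C)[::-1]  # descending order

# The family has 2 parameters, so exactly 2 eigenvalues should dominate.
print(eigvals[:4])
```

Here the displacements all lie in the span of the two functions $x \mapsto 1$ and $x \mapsto x$, so the spectrum has exactly two nonzero eigenvalues up to floating-point error; a genuinely curved submanifold would instead show a spectral gap after the intrinsic dimension rather than an exact rank drop.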
@article{hamm2025_2311.08549,
  title   = {Manifold learning in Wasserstein space},
  author  = {Keaton Hamm and Caroline Moosmüller and Bernhard Schmitzer and Matthew Thorpe},
  journal = {arXiv preprint arXiv:2311.08549},
  year    = {2025}
}