
Stochastic Optimization in Semi-Discrete Optimal Transport: Convergence Analysis and Minimax Rate

Ferdinand Genans
Antoine Godichon-Baggioni
François-Xavier Vialard
Olivier Wintenberger
Main: 9 pages
2 figures
1 table
Bibliography: 3 pages
Appendix: 24 pages
Abstract

We investigate the semi-discrete Optimal Transport (OT) problem, where a continuous source measure $\mu$ is transported to a discrete target measure $\nu$, with particular attention to the approximation of the OT map. In this setting, solvers based on Stochastic Gradient Descent (SGD) have demonstrated strong empirical performance in recent machine learning applications, yet whether they come with theoretical guarantees for approximating the OT map has remained an open question. In this work, we answer it positively by providing both computational and statistical convergence guarantees for SGD. Specifically, we show that SGD methods can estimate the OT map at the minimax convergence rate $\mathcal{O}(1/\sqrt{n})$, where $n$ is the number of samples drawn from $\mu$. To establish this result, we study the averaged projected SGD algorithm and identify a suitable projection set that contains a minimizer of the objective, even when the source measure is not compactly supported. Our analysis holds under mild assumptions on the source measure and applies to MTW cost functions, which include $\|\cdot\|^p$ for $p \in (1, \infty)$. Finally, we provide numerical evidence supporting our theoretical results.
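The abstract does not spell out the solver, so the following is a minimal sketch of the kind of averaged projected SGD method it refers to, assuming the standard semi-dual formulation of semi-discrete OT, $\sup_{g \in \mathbb{R}^m} \mathbb{E}_{X \sim \mu}\big[\min_j \big(c(X, y_j) - g_j\big)\big] + \sum_j g_j \nu_j$ (up to sign conventions), an $O(1/\sqrt{t})$ step size, and a simple box projection standing in for the paper's projection set; the function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def semi_dual_asgd(sample_mu, Y, nu, cost, n_iter=50_000, lr0=1.0, radius=None, seed=None):
    """Averaged (projected) SGD on the semi-dual of semi-discrete OT.

    sample_mu : callable rng -> a sample x ~ mu in R^d
    Y         : (m, d) array of target support points y_1, ..., y_m
    nu        : (m,) target weights summing to 1
    cost      : callable (x, Y) -> (m,) vector of costs c(x, y_j)
    radius    : optional box-projection radius (illustrative stand-in for
                the projection set analyzed in the paper)
    Returns the Polyak-Ruppert average of the dual potentials.
    """
    rng = np.random.default_rng(seed)
    m = len(nu)
    g = np.zeros(m)       # dual potentials
    g_bar = np.zeros(m)   # running (Polyak-Ruppert) average
    for t in range(1, n_iter + 1):
        x = sample_mu(rng)
        j = np.argmin(cost(x, Y) - g)        # index attaining the c-transform
        grad = nu.copy()                     # stochastic gradient: nu - e_j
        grad[j] -= 1.0
        g = g + (lr0 / np.sqrt(t)) * grad    # ascent step with O(1/sqrt(t)) step size
        if radius is not None:
            g = np.clip(g, -radius, radius)  # projection step (box constraint)
        g_bar += (g - g_bar) / t             # online averaging
    return g_bar

def ot_map(x, Y, g_bar, cost):
    """Estimated OT map: send x to the target point attaining the c-transform."""
    return Y[np.argmin(cost(x, Y) - g_bar)]

# Example: Gaussian source, uniform target over 5 points, squared Euclidean cost.
Y = np.random.default_rng(0).normal(size=(5, 2))
nu = np.full(5, 0.2)
sq_cost = lambda x, Y: np.sum((Y - x) ** 2, axis=1)
g_bar = semi_dual_asgd(lambda rng: rng.normal(size=2), Y, nu, sq_cost, radius=10.0)
print(ot_map(np.zeros(2), Y, g_bar, sq_cost))
```

The averaged iterate `g_bar` plays the role of the estimated dual potentials; the induced map assigns each source point to the target point attaining the minimum in the c-transform, which is how the OT map estimate described in the abstract is typically read off.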
