Rate-Optimal Noise Annealing in Semi-Dual Neural Optimal Transport: Tangential Identifiability, Off-Manifold Ambiguity, and Guaranteed Recovery

Raymond Chu
Jaewoong Choi
Dohyun Kwon
Main: 8 pages · Appendix: 14 pages · Bibliography: 2 pages · 5 figures · 6 tables
Abstract

Semi-dual neural optimal transport learns a transport map via a max-min objective, yet training can converge to incorrect or degenerate maps. We fully characterize these spurious solutions in the common regime where data concentrate on a low-dimensional manifold: the objective is underconstrained off the data manifold, while the on-manifold transport signal remains identifiable. Following Choi, Choi, and Kwon (2025), we study additive-noise smoothing as a remedy and prove new map-recovery guarantees as the noise vanishes. Our main practical contribution is a computable terminal noise level $\varepsilon_{\mathrm{stat}}(N)$ that attains the optimal statistical rate, with scaling governed by the intrinsic dimension $m$ of the data. The formula arises from a unified theoretical analysis of (i) quantitative stability of optimal plans, (ii) smoothing-induced bias, and (iii) finite-sample error, yielding rates that depend on $m$ rather than the ambient dimension. Finally, we show that the reduced semi-dual objective becomes increasingly ill-conditioned as $\varepsilon \downarrow 0$. This provides a principled stopping rule: annealing below $\varepsilon_{\mathrm{stat}}(N)$ can worsen optimization conditioning without improving statistical accuracy.
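The stopping rule described above can be sketched in a few lines: compute a terminal noise level from the sample count $N$ and intrinsic dimension $m$, then anneal geometrically and clamp at that level rather than driving the noise to zero. The rate exponent below ($N^{-1/(m+4)}$) and the constant `c` are illustrative placeholders, not the paper's derived formula; the point is only the structure of the schedule.

```python
def terminal_noise_level(n_samples: int, intrinsic_dim: int, c: float = 1.0) -> float:
    """Hypothetical terminal noise level eps_stat(N) ~ C * N^{-1/(m+4)}.

    The exponent is an illustrative placeholder; the paper derives the
    actual rate, which depends on the intrinsic dimension m of the data
    manifold rather than the ambient dimension.
    """
    return c * n_samples ** (-1.0 / (intrinsic_dim + 4))


def annealing_schedule(eps0: float, eps_stat: float, decay: float = 0.5) -> list[float]:
    """Geometric noise annealing, stopped at eps_stat.

    Annealing below eps_stat would worsen the conditioning of the reduced
    semi-dual objective without improving statistical accuracy, so the
    schedule is clamped at that level instead of decaying to zero.
    """
    eps, schedule = eps0, []
    while eps > eps_stat:
        schedule.append(eps)
        eps *= decay
    schedule.append(eps_stat)  # clamp at the statistically optimal level
    return schedule


# Example: N = 10,000 samples on a manifold of (assumed) intrinsic dimension 8.
eps_stat = terminal_noise_level(10_000, intrinsic_dim=8)
schedule = annealing_schedule(1.0, eps_stat)
```

Each entry of `schedule` would serve as the additive-noise standard deviation for one training phase, with the final phase held at $\varepsilon_{\mathrm{stat}}(N)$.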
