Sinkhorn Distributionally Robust Optimization

Abstract
We study distributionally robust optimization with the Sinkhorn distance, a variant of the Wasserstein distance based on entropic regularization. We derive a convex programming dual reformulation that holds for general nominal distributions, transport costs, and loss functions. To solve the dual reformulation, we develop a stochastic mirror descent algorithm with biased subgradient estimators and establish its computational complexity guarantees. Finally, we provide numerical examples on synthetic and real data to demonstrate the superior performance of the proposed approach.
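To make the abstract concrete, the following is a minimal, hedged Python sketch of the kind of computation involved: a nested Monte Carlo estimator for a dual objective of the log-mean-exp form lam*rho + lam*eps*E_x[log E_{z|x} exp(loss(theta, z)/(lam*eps))], followed by plain stochastic gradient steps on theta. The Gaussian sampling of transported points, the squared loss, the fixed dual variable lam, and all function names are assumptions made for illustration only; the paper itself derives the exact dual reformulation and uses stochastic mirror descent rather than this simplified update.

    # Illustrative sketch (not the paper's algorithm): nested Monte Carlo for a
    # Sinkhorn-DRO-style dual objective and a simple stochastic gradient step.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: the nominal distribution is the empirical distribution of (x_i, y_i).
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    theta_true = rng.normal(size=d)
    y = X @ theta_true + 0.1 * rng.normal(size=n)

    def loss(theta, Z, y_batch):
        """Squared loss at perturbed points Z (shape: batch x m x d)."""
        preds = Z @ theta                       # (batch, m)
        return (preds - y_batch[:, None]) ** 2  # (batch, m)

    def dual_objective_and_grad(theta, lam, rho, eps, batch=32, m=64):
        """Estimate lam*rho + lam*eps*E_x[log E_{z|x} exp(loss/(lam*eps))] and a
        gradient in theta.  The inner expectation is taken over an assumed
        Gaussian perturbation z ~ N(x, eps*I); with finite m, the log of an
        empirical average makes the estimator biased, which is why the paper
        analyzes mirror descent with biased subgradient estimators."""
        idx = rng.choice(n, size=batch, replace=False)
        x_b, y_b = X[idx], y[idx]

        # Draw m perturbed points per nominal sample.
        Z = x_b[:, None, :] + np.sqrt(eps) * rng.normal(size=(batch, m, X.shape[1]))
        L = loss(theta, Z, y_b) / (lam * eps)          # (batch, m)

        # Numerically stabilized log-mean-exp over the inner samples.
        Lmax = L.max(axis=1, keepdims=True)
        w = np.exp(L - Lmax)
        log_mean_exp = Lmax[:, 0] + np.log(w.mean(axis=1))
        obj = lam * rho + lam * eps * log_mean_exp.mean()

        # Gradient in theta: softmax-weighted average of per-sample loss gradients.
        probs = w / w.sum(axis=1, keepdims=True)       # (batch, m)
        resid = 2.0 * (Z @ theta - y_b[:, None])       # d loss / d prediction
        grad_theta = np.einsum("bm,bm,bmd->d", probs, resid, Z) / batch
        return obj, grad_theta

    # Plain SGD on theta with lam held fixed (the paper optimizes the dual
    # variable as well and uses a mirror descent update).
    theta, lam, rho, eps = np.zeros(d), 1.0, 0.1, 0.5
    for t in range(200):
        obj, g = dual_objective_and_grad(theta, lam, rho, eps)
        theta -= 0.05 / np.sqrt(t + 1) * g

    print("final objective estimate:", obj)

The inner softmax weights show why the estimator is biased for finite m and why its accuracy degrades as lam*eps shrinks, which is the regime the paper's complexity analysis must handle.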
@article{wang2025_2109.11926,
  title   = {Sinkhorn Distributionally Robust Optimization},
  author  = {Jie Wang and Rui Gao and Yao Xie},
  journal = {arXiv preprint arXiv:2109.11926},
  year    = {2025}
}