Linear Partial Gromov-Wasserstein Embedding

The Gromov-Wasserstein (GW) problem, a variant of the classical optimal transport (OT) problem, has attracted growing interest in the machine learning and data science communities due to its ability to quantify similarity between measures in different metric spaces. However, like the classical OT problem, GW imposes an equal mass constraint between measures, which restricts its application in many machine learning tasks. To address this limitation, the partial Gromov-Wasserstein (PGW) problem has been introduced. It relaxes the equal mass constraint, allowing the comparison of general positive Radon measures. Despite this, both GW and PGW face significant computational challenges due to their non-convex nature. To overcome these challenges, we propose the linear partial Gromov-Wasserstein (LPGW) embedding, a linearized embedding technique for the PGW problem. For K different metric measure spaces, the pairwise computation of the PGW distance requires solving the PGW problem O(K^2) times. In contrast, the proposed linearization technique reduces this to O(K) times. Similar to the linearization technique for the classical OT problem, we prove that LPGW defines a valid metric for metric measure spaces. Finally, we demonstrate the effectiveness of LPGW in practical applications such as shape retrieval and learning with transport-based embeddings, showing that LPGW preserves the advantages of PGW in partial matching while significantly enhancing computational efficiency. The code is available at this https URL.
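The complexity claim above can be illustrated with a minimal sketch (not the authors' code; `solve_pgw` is a hypothetical stand-in for an expensive PGW solver): computing all pairwise distances among K spaces directly requires one solve per unordered pair, i.e. K(K-1)/2 solves, whereas an LPGW-style embedding needs only one solve per space against a fixed reference, i.e. K solves, after which distances are compared cheaply in the embedding space.

```python
# Sketch: count expensive PGW solves for the direct pairwise approach
# versus an LPGW-style embedding approach. All names are illustrative.
from itertools import combinations


class SolveCounter:
    """Wraps a mock PGW solver and counts how often it is invoked."""

    def __init__(self):
        self.calls = 0

    def solve_pgw(self, space_a, space_b):
        self.calls += 1  # stand-in for an expensive non-convex solve
        return 0.0       # a real solver would return a distance / transport plan


def pairwise_distances(spaces, counter):
    # Direct approach: one PGW solve per unordered pair -> K*(K-1)/2 solves.
    return {(i, j): counter.solve_pgw(spaces[i], spaces[j])
            for i, j in combinations(range(len(spaces)), 2)}


def lpgw_embeddings(spaces, reference, counter):
    # Embedding approach: one PGW solve per space against a reference
    # -> K solves; pairwise distances then come from the embeddings alone.
    return [counter.solve_pgw(s, reference) for s in spaces]


K = 10
spaces = [object() for _ in range(K)]

direct = SolveCounter()
pairwise_distances(spaces, direct)
print(direct.calls)   # 45 solves for K = 10

linear = SolveCounter()
lpgw_embeddings(spaces, object(), linear)
print(linear.calls)   # 10 solves for K = 10
```

The gap widens quadratically with K, which is why the embedding pays off for large collections of metric measure spaces.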
@article{bai2025_2410.16669,
  title={Linear Partial Gromov-Wasserstein Embedding},
  author={Yikun Bai and Abihith Kothapalli and Hengrong Du and Rocio Diaz Martin and Soheil Kolouri},
  journal={arXiv preprint arXiv:2410.16669},
  year={2025}
}