
A quasi-polynomial time algorithm for Multi-Dimensional Scaling via LP hierarchies

Annual Conference on Computational Learning Theory (COLT), 2023
Main: 33 pages
4 figures
Bibliography: 5 pages
Abstract

Multi-dimensional Scaling (MDS) is a family of methods for embedding pairwise dissimilarities between $n$ objects into low-dimensional space. MDS is widely used as a data visualization tool in the social and biological sciences, statistics, and machine learning. We study the Kamada-Kawai formulation of MDS: given a set of non-negative dissimilarities $\{d_{i,j}\}_{i,j \in [n]}$ over $n$ points, the goal is to find an embedding $\{x_1,\dots,x_n\} \subset \mathbb{R}^k$ that minimizes \[ \text{OPT} = \min_{x} \mathbb{E}_{i,j \in [n]} \left[ \left(1-\frac{\|x_i - x_j\|}{d_{i,j}}\right)^2 \right]. \] Despite its popularity, our theoretical understanding of MDS is extremely limited. Recently, Demaine, Hesterberg, Koehler, Lynch, and Urschel (arXiv:2109.11505) gave the first approximation algorithm with provable guarantees for Kamada-Kawai, which achieves an embedding with cost $\text{OPT} + \epsilon$ in $n^2 \cdot 2^{\tilde{\mathcal{O}}(k \Delta^4 / \epsilon^2)}$ time, where $\Delta$ is the aspect ratio of the input dissimilarities. In this work, we give the first approximation algorithm for MDS with quasi-polynomial dependency on $\Delta$: for target dimension $k$, we achieve a solution with cost $\mathcal{O}(\text{OPT}^{1/k} \cdot \log(\Delta/\epsilon)) + \epsilon$ in time $n^{\mathcal{O}(1)} \cdot 2^{\tilde{\mathcal{O}}(k^2 (\log(\Delta)/\epsilon)^{k/2+1})}$. Our approach is based on a novel analysis of a conditioning-based rounding scheme for the Sherali-Adams LP hierarchy. Crucially, our analysis exploits the geometry of low-dimensional Euclidean space, allowing us to avoid an exponential dependence on the aspect ratio $\Delta$. We believe our geometry-aware treatment of the Sherali-Adams hierarchy is an important step towards developing general-purpose techniques for efficient metric optimization algorithms.
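To make the objective concrete, the Kamada-Kawai cost in the abstract can be evaluated directly for a candidate embedding. The sketch below is purely illustrative (not the paper's algorithm): it computes the empirical mean over pairs $i \neq j$ of $(1 - \|x_i - x_j\|/d_{i,j})^2$, excluding the diagonal since $d_{i,i} = 0$; the function name and the example inputs are ours, not from the paper.

```python
import numpy as np

def kamada_kawai_cost(X, D):
    """Kamada-Kawai MDS objective: mean over pairs i != j of
    (1 - ||x_i - x_j|| / d_{i,j})^2.

    X: (n, k) array, the embedding x_1, ..., x_n in R^k.
    D: (n, n) symmetric array of dissimilarities d_{i,j},
       positive off the diagonal.
    """
    n = X.shape[0]
    # all pairwise Euclidean distances ||x_i - x_j|| of the embedding
    diff = X[:, None, :] - X[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    mask = ~np.eye(n, dtype=bool)  # exclude i == j (d_{i,i} = 0)
    return np.mean((1.0 - dist[mask] / D[mask]) ** 2)

# Example: points on a line, with dissimilarities equal to their
# exact line distances, so a perfect embedding has cost 0.
X = np.array([[0.0], [1.0], [3.0]])
D = np.abs(X - X.T)
cost = kamada_kawai_cost(X, D)
```

Note that the cost is scale-sensitive: unlike classical MDS, rescaling the embedding changes the objective, since each term compares an absolute distance to $d_{i,j}$.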
