Metric Embeddings Beyond Bi-Lipschitz Distortion via Sherali-Adams
Ainesh Bakshi, Vincent Cohen-Addad, Samuel B. Hopkins, Rajesh Jayaram, Silvio Lattanzi
Abstract
Metric embeddings are a widely used method in algorithm design, where generally a ``complex'' metric is embedded into a simpler, lower-dimensional one. Historically, the theoretical computer science community has focused on bi-Lipschitz embeddings, which guarantee that every pairwise distance is approximately preserved. In contrast, alternative embedding objectives that are commonly used in practice avoid bi-Lipschitz distortion; yet these approaches have received comparatively less study in theory. In this paper, we focus on Multi-dimensional Scaling (MDS), where we are given a set of non-negative dissimilarities $\{d_{i,j}\}_{i,j \in [n]}$ over $n$ points, and the goal is to find an embedding $\{x_1,\dots,x_n\} \subset \mathbb{R}^k$ that minimizes
$$\mathrm{OPT} = \min_{x} \; \mathbb{E}_{i,j \in [n]}\left[\left(1 - \frac{\|x_i - x_j\|}{d_{i,j}}\right)^2\right].$$
Despite its popularity, our theoretical understanding of MDS is extremely limited. Recently, Demaine et al. (arXiv:2109.11505) gave the first approximation algorithm with provable guarantees for this objective, which achieves an embedding in constant-dimensional Euclidean space with cost $\mathrm{OPT} + \epsilon$ in $n^2 \cdot 2^{\mathrm{poly}(\Delta/\epsilon)}$ time, where $\Delta$ is the aspect ratio of the input dissimilarities. For metrics that admit low-cost embeddings, $\Delta$ scales polynomially in $n$. In this work, we give the first approximation algorithm for MDS with quasi-polynomial dependency on $\Delta$: for constant-dimensional Euclidean space, we achieve a solution with cost $O(\log \Delta) \cdot \mathrm{OPT}^{\Omega(1)} + \epsilon$ in time $n^{O(1)} \cdot 2^{\mathrm{poly}(\log(\Delta)/\epsilon)}$. Our algorithms are based on a novel geometry-aware analysis of a conditional rounding of the Sherali-Adams LP Hierarchy, allowing us to avoid exponential dependency on the aspect ratio, which would typically result from this rounding.
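As a concrete illustration (not part of the paper), the MDS objective above can be evaluated directly for a candidate embedding: it is the average, over pairs, of the squared relative error between embedded distance and input dissimilarity. The helper name `mds_cost` is hypothetical.

```python
import numpy as np

def mds_cost(X, D):
    """Average of (1 - ||x_i - x_j|| / d_{i,j})^2 over pairs i < j.

    X : (n, k) array of embedded points in R^k.
    D : (n, n) array of non-negative dissimilarities (diagonal unused).
    """
    n = len(X)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            dist = np.linalg.norm(X[i] - X[j])
            total += (1.0 - dist / D[i][j]) ** 2
            pairs += 1
    return total / pairs
```

For example, three collinear points embedded at 0, 1, and 3 on a line exactly realize the dissimilarity matrix with entries 1, 3, and 2, so the cost is 0; any distortion of the pairwise distances makes the cost strictly positive.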