A Note on Hardness of Diameter Approximation

5 May 2017
K. Bringmann
Sebastian Krinninger
arXiv:1705.02127 (abs, PDF, HTML)
Abstract

We revisit the hardness of approximating the diameter of a network. In the CONGEST model, $\tilde{\Omega}(n)$ rounds are necessary to compute the diameter [Frischknecht et al. SODA'12]. Abboud et al. [DISC'16] extended this result to sparse graphs and, at a more fine-grained level, showed that, for any integer $1 \leq \ell \leq \operatorname{polylog}(n)$, distinguishing between networks of diameter $4\ell + 2$ and $6\ell + 1$ requires $\tilde{\Omega}(n)$ rounds. We slightly tighten this result by showing that even distinguishing between diameter $2\ell + 1$ and $3\ell + 1$ requires $\tilde{\Omega}(n)$ rounds. The reduction of Abboud et al. is inspired by recent conditional lower bounds in the RAM model, where the orthogonal vectors problem plays a pivotal role. In our new lower bound, we make the connection to orthogonal vectors explicit, leading to a conceptually more streamlined exposition. This makes the proof well suited for teaching both the lower bound in the CONGEST model and the conditional lower bound in the RAM model.
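
For readers unfamiliar with the orthogonal vectors (OV) problem that drives the reduction, the following minimal Python sketch (illustrative, not from the paper) states it concretely: given two sets $A, B$ of $n$ Boolean vectors in dimension $d$, decide whether some $a \in A$ and $b \in B$ have dot product zero. The naive check runs in $O(n^2 d)$ time, and under the Strong Exponential Time Hypothesis no $O(n^{2-\varepsilon})$-time algorithm exists for any $\varepsilon > 0$ once $d = \omega(\log n)$; this quadratic barrier is the conditional hardness the abstract refers to.

```python
from itertools import product

def has_orthogonal_pair(A, B):
    """Naive O(n^2 * d) orthogonal vectors check: return True iff
    some a in A and b in B share no coordinate where both are 1
    (equivalently, their dot product is zero)."""
    return any(all(x * y == 0 for x, y in zip(a, b))
               for a, b in product(A, B))

# (1, 0, 1) and (0, 1, 0) have dot product 0, so a pair exists.
A = [(1, 0, 1), (1, 1, 0)]
B = [(0, 1, 0), (1, 1, 1)]
print(has_orthogonal_pair(A, B))  # True
```

Loosely speaking, the CONGEST lower bound embeds such an instance into a network so that the two vector sets sit on opposite sides of a narrow cut; distinguishing the two diameter values then forces the network to solve OV by communicating across that cut, which is where the $\tilde{\Omega}(n)$ round bound comes from.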
