Distributed Minimum Cut Approximation
We study the problem of computing approximate minimum edge cuts by distributed algorithms. We present two randomized approximation algorithms that both run in a standard synchronous message passing model where in each round, $O(\log n)$ bits can be transmitted over every edge (a.k.a. the CONGEST model). The first algorithm is based on a simple and new approach for analyzing random edge sampling, which we call the random layering technique. For any weighted graph and any $\epsilon \in (0, 1)$, the algorithm finds a cut of size at most $O(\epsilon^{-1}\lambda)$ in $O(D) + \tilde{O}(n^{1/2+\epsilon})$ rounds, where $\lambda$ is the minimum-cut size, $D$ is the network diameter, and the $\tilde{O}$-notation hides poly-logarithmic factors in $n$. In addition, using the outline of a centralized algorithm due to Matula [SODA '93], we present a randomized algorithm to compute a cut of size at most $(2+\epsilon)\lambda$ in $\tilde{O}((D+\sqrt{n})\epsilon^{-5})$ rounds for any $\epsilon > 0$. The time complexities of our algorithms almost match the $\tilde{\Omega}(D+\sqrt{n})$ lower bound of Das Sarma et al. [STOC '11], thus answering an open question raised by Elkin [SIGACT-News '04] and Das Sarma et al. [STOC '11]. To complement our upper bound results, we also strengthen the lower bound of Das Sarma et al. by extending it to unweighted graphs. We show that the same lower bound also holds for unweighted multigraphs (or equivalently for weighted graphs in which $O(w \log n)$ bits can be transmitted in each round over an edge of weight $w$). For unweighted simple graphs, we show that computing an $\alpha$-approximate minimum cut requires time at least $\tilde{\Omega}(D + \sqrt{n}/\alpha^{1/4})$.
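The core intuition behind analyzing random edge sampling for minimum cuts can be illustrated with a small centralized sketch (this is only an illustration of the sampling phenomenon, not the paper's distributed random layering algorithm): if every edge is kept independently with probability p, the sampled subgraph tends to stay connected once p is large relative to 1/λ, because some edge of every cut survives; for small p, cuts of size λ are lost and the graph disconnects. The toy instance below (an assumption for demonstration: two 8-node cliques joined by a 3-edge cut, so λ = 3) shows the survival rate rising with p.

```python
import random

random.seed(7)  # fixed seed so repeated runs behave the same

def connected(n, edges):
    """Union-find connectivity check on nodes 0..n-1."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    comps = n
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            comps -= 1
    return comps == 1

def sample_survival_rate(n, edges, p, trials=200):
    """Fraction of trials in which the subgraph obtained by keeping
    each edge independently with probability p is still connected."""
    hits = sum(connected(n, [e for e in edges if random.random() < p])
               for _ in range(trials))
    return hits / trials

# Toy instance (hypothetical, chosen for illustration): two cliques of
# size 8 joined by a 3-edge cut, hence minimum cut size lambda = 3.
n = 16
edges = [(i, j) for i in range(8) for j in range(i + 1, 8)]
edges += [(i, j) for i in range(8, 16) for j in range(i + 1, 16)]
edges += [(0, 8), (1, 9), (2, 10)]

for p in (0.1, 0.5, 0.9):
    print(f"p = {p}: survival rate = {sample_survival_rate(n, edges, p)}")
```

At p = 0.1 the 3-edge cut (and even the sparse sampled cliques) rarely survive intact, while at p = 0.9 the sampled graph is almost always connected; the transition point carries information about λ, which is what sampling-based min-cut arguments exploit.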