vqSGD: Vector Quantized Stochastic Gradient Descent

Abstract

In this work, we present vqSGD (Vector-Quantized Stochastic Gradient Descent), a family of vector quantization schemes that provides an asymptotic reduction in communication cost with convergence guarantees for distributed optimization. In particular, we consider a randomized scheme, based on the convex hull of a point set, that returns an unbiased estimator of a d-dimensional gradient vector with bounded variance. We provide multiple efficient instances of our scheme that require only o(d) bits of communication, at the expense of a reasonable increase in variance. These instances are constructed from binary error-correcting codes and provide a smooth trade-off between communication cost and quantization variance. Furthermore, we show that vqSGD also offers strong privacy guarantees.
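To make the convex-hull idea concrete, below is a minimal Python sketch of one such randomized scheme, assuming a scaled cross-polytope point set {±√d·e_i} (whose convex hull contains every unit-norm vector): the gradient is written as a convex combination of the 2d vertices, one vertex is sampled with probability equal to its convex coefficient, and only the vertex index (log₂(2d) bits) is communicated. The function names and the specific point set are illustrative assumptions, not the paper's code.

```python
import numpy as np

def vq_quantize(v, rng=None):
    """Quantize a gradient v (assumed ||v||_2 <= 1; otherwise send the norm
    separately and quantize v/||v||_2) to a vertex of the cross-polytope
    {+sqrt(d)*e_i, -sqrt(d)*e_i}. Returns an index in [0, 2d)."""
    rng = np.random.default_rng() if rng is None else rng
    d = v.size
    gamma = np.sqrt(d)                   # conv{±gamma*e_i} contains the l2 unit ball
    a_pos = np.maximum(v, 0.0) / gamma   # weight on +gamma*e_i
    a_neg = np.maximum(-v, 0.0) / gamma  # weight on -gamma*e_i
    # Distribute the leftover mass evenly over ±gamma*e_i pairs; the pairs
    # cancel in expectation, so unbiasedness is preserved.
    slack = 1.0 - (a_pos.sum() + a_neg.sum())
    probs = np.concatenate([a_pos, a_neg]) + slack / (2 * d)
    return rng.choice(2 * d, p=probs)    # only log2(2d) bits to communicate

def vq_decode(index, d):
    """Reconstruct the sampled vertex: an unbiased estimator of v."""
    gamma = np.sqrt(d)
    e = np.zeros(d)
    if index < d:
        e[index] = gamma
    else:
        e[index - d] = -gamma
    return e

# Averaging many decoded samples recovers v, illustrating unbiasedness:
rng = np.random.default_rng(0)
d = 8
v = rng.normal(size=d)
v /= np.linalg.norm(v)
est = np.mean([vq_decode(vq_quantize(v, rng), d) for _ in range(20000)], axis=0)
print(np.max(np.abs(est - v)))  # small: E[decode(quantize(v))] = v
```

Since E[decode(quantize(v))] = Σᵢ probsᵢ·cᵢ = v by construction, the estimator is unbiased; its variance is bounded (each vertex has norm √d), which is the variance-for-communication trade-off the abstract describes.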
