Communication-Efficient Distributed SVD via Local Power Iterations
International Conference on Machine Learning (ICML), 2020
Abstract
We study the distributed computation of the truncated singular value decomposition (SVD). We develop an algorithm that we call \texttt{LocalPower} for improving communication efficiency. Specifically, we uniformly partition the dataset among nodes and alternate between multiple (precisely $p$) local power iterations and one global aggregation. We theoretically show that under certain assumptions, \texttt{LocalPower} lowers the required number of communications by a factor of $p$ to reach a given accuracy. We also show that the strategy of periodically decaying $p$ helps improve the performance of \texttt{LocalPower}. We conduct experiments to demonstrate the effectiveness of \texttt{LocalPower}.
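The scheme the abstract describes can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the paper's exact algorithm: each node holds a row block of the data matrix, runs $p$ power iterations on its local Gram matrix, and the orthonormal bases are then averaged (weighted by block size) and re-orthonormalized in one aggregation step. The function name `local_power_sketch` and all parameter choices are our own assumptions.

```python
import numpy as np

def local_power_sketch(A_blocks, k, p, rounds, seed=0):
    """Hypothetical sketch of a LocalPower-style iteration.

    A_blocks : list of row blocks of the data matrix (each shape n_i x d)
    k        : target rank of the truncated SVD
    p        : number of local power iterations between aggregations
    rounds   : number of communication rounds
    Returns an approximate orthonormal basis of the top-k right
    singular subspace (shape d x k).
    """
    d = A_blocks[0].shape[1]
    rng = np.random.default_rng(seed)
    # Shared random orthonormal initialization on every node.
    Z, _ = np.linalg.qr(rng.standard_normal((d, k)))
    # Weight each node by its share of the rows (uniform partition).
    w = np.array([B.shape[0] for B in A_blocks], dtype=float)
    w /= w.sum()
    for _ in range(rounds):
        local_bases = []
        for B in A_blocks:
            Zi = Z
            for _ in range(p):
                # One local power iteration on the block Gram matrix,
                # with QR for numerical stability.
                Zi, _ = np.linalg.qr(B.T @ (B @ Zi))
            local_bases.append(Zi)
        # One global aggregation: weighted average, then re-orthonormalize.
        Z, _ = np.linalg.qr(sum(wi * Zi for wi, Zi in zip(w, local_bases)))
    return Z
```

Note that naively averaging orthonormal bases only makes sense when the local iterates stay close to each other (e.g., a uniform random partition, as the abstract assumes); the residual error of this averaging is what motivates decaying $p$ over time.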
