Efficient Cross-Device Federated Learning Algorithms for Minimax Problems
In many machine learning applications, massive and privacy-sensitive data are generated on numerous mobile or IoT devices, and collecting the data in a centralized location may be prohibitive. It is therefore increasingly attractive to estimate model parameters on the devices themselves while keeping the data localized. This learning setting is known as cross-device federated learning. In this paper, we propose the first theoretically guaranteed algorithms for general minimax problems in the cross-device federated learning setting. Our algorithms require only a fraction of devices to participate in each round of training, which overcomes the difficulty introduced by the low availability of individual devices. Communication overhead is further reduced by performing multiple local update steps on clients before communicating with the server, and global gradient estimates are leveraged to correct the bias that data heterogeneity introduces into local update directions. By developing analyses based on novel potential functions, we establish theoretical convergence guarantees for our algorithms. Experimental results on AUC maximization, robust adversarial network training, and GAN training tasks demonstrate the efficiency of our algorithms.
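The three ingredients described above (partial device participation, multiple local steps per round, and a global gradient estimate that corrects heterogeneity-induced drift) can be illustrated on a toy minimax problem. The sketch below is not the paper's algorithm: it applies a SCAFFOLD-style drift correction to local gradient descent ascent on a synthetic strongly-convex-strongly-concave objective, and every name and hyperparameter in it is illustrative.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): federated gradient
# descent ascent with partial participation, K local steps, and a
# global gradient estimate that corrects client drift.
# Toy objective: min_x max_y (1/N) * sum_i f_i(x, y), where
#   f_i(x, y) = 0.5*||x||^2 + x^T A_i y - 0.5*||y||^2,
# whose saddle point is (0, 0).

rng = np.random.default_rng(0)
num_clients, dim = 20, 5
A = [rng.normal(size=(dim, dim)) / np.sqrt(dim) for _ in range(num_clients)]

def local_grads(i, x, y):
    """Gradients of f_i w.r.t. x (to be descended) and y (to be ascended)."""
    return x + A[i] @ y, A[i].T @ x - y

x, y = rng.normal(size=dim), rng.normal(size=dim)
gx_bar = np.zeros(dim)  # server-side global gradient estimates, used to
gy_bar = np.zeros(dim)  # correct biased local update directions

eta, K, S = 0.05, 10, 5  # local step size, local steps, clients per round

for rnd in range(300):
    sampled = rng.choice(num_clients, size=S, replace=False)  # partial participation
    dx, dy = np.zeros(dim), np.zeros(dim)
    gx_new, gy_new = np.zeros(dim), np.zeros(dim)
    for i in sampled:
        xi, yi = x.copy(), y.copy()
        gx0, gy0 = local_grads(i, x, y)  # client gradient at the round's start point
        gx_new += gx0
        gy_new += gy0
        for _ in range(K):  # multiple local steps before any communication
            gx, gy = local_grads(i, xi, yi)
            # Drift correction: replace the purely local direction with
            # (local gradient) - (start-point gradient) + (global estimate).
            xi -= eta * (gx - gx0 + gx_bar)
            yi += eta * (gy - gy0 + gy_bar)
        dx += xi - x
        dy += yi - y
    x += dx / S  # server averages the sampled clients' model deltas
    y += dy / S
    gx_bar, gy_bar = gx_new / S, gy_new / S  # refresh global gradient estimates

print("distance to saddle point:", np.linalg.norm(x) + np.linalg.norm(y))
```

In this sketch, the start-of-round gradient plays the role of a client control variate: subtracting it and adding the server's global estimate keeps each local step aimed at the global descent-ascent direction rather than the client's own biased one, which is the role the abstract assigns to the global gradient estimates.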