Advances and Open Problems in Federated Learning
Peter Kairouz
H. B. McMahan
Brendan Avent
A. Bellet
M. Bennis
A. Bhagoji
Keith Bonawitz
Zachary B. Charles
Graham Cormode
Rachel Cummings
Rafael G. L. D'Oliveira
Hubert Eichner
S. E. Rouayheb
David E. Evans
Josh Gardner
Zachary Garrett
Adria Gascon
Badih Ghazi
Phillip B. Gibbons
Marco Gruteser
Zaïd Harchaoui
Chaoyang He
Lie He
Zhouyuan Huo
Ben Hutchinson
Justin Hsu
Martin Jaggi
T. Javidi
Gauri Joshi
M. Khodak
Jakub Konečný
Aleksandra Korolova
F. Koushanfar
Oluwasanmi Koyejo
Tancrède Lepoint
Yang Liu
Prateek Mittal
M. Mohri
Richard Nock
A. Özgür
Rasmus Pagh
Mariana Raykova
Hang Qi
Daniel Ramage
Ramesh Raskar
D. Song
Weikang Song
Sebastian U. Stich
Ziteng Sun
A. Suresh
Florian Tramèr
Praneeth Vepakomma
Jianyu Wang
Li Xiong
Zheng Xu
Qiang Yang
Felix X. Yu
Han Yu
Sen Zhao

Abstract
Federated learning (FL) is a machine learning setting where many clients (e.g. mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server (e.g. service provider), while keeping the training data decentralized. FL embodies the principles of focused data collection and minimization, and can mitigate many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science approaches. Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.
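The setting described in the abstract — clients training locally while a central server only aggregates their model updates — can be illustrated with a minimal sketch of one round of Federated Averaging (FedAvg), the canonical FL algorithm. All names and the toy scalar model below are illustrative, not taken from this paper.

```python
# Minimal sketch of one round of Federated Averaging (FedAvg).
# Toy setup: a scalar least-squares model y ≈ w * x; the function
# and variable names are illustrative, not from this paper.

def local_update(w, data, lr=0.1):
    """One pass of gradient descent on a client's local data.

    The raw (x, y) pairs never leave the client; only the updated
    weight is returned to the server.
    """
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def federated_averaging_round(global_w, client_datasets):
    """Server broadcasts global_w; each client trains locally;
    the server averages the results weighted by local dataset size."""
    updates, sizes = [], []
    for data in client_datasets:
        updates.append(local_update(global_w, data))
        sizes.append(len(data))
    total = sum(sizes)
    return sum(w * n for w, n in zip(updates, sizes)) / total

# Two clients whose local data both follow y = 2x; repeated rounds
# pull the global weight toward w = 2 without pooling the data.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = federated_averaging_round(w, clients)
```

In this sketch the server sees only per-client weights, matching the abstract's point about focused data collection: training data stays decentralized, and only model updates are exchanged.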