Trade-off in Estimating the Number of Byzantine Clients in Federated Learning

Federated learning has attracted increasing attention in recent large-scale optimization and machine learning research and applications, but it is also vulnerable to Byzantine clients that can send arbitrary erroneous signals. Robust aggregators are commonly used to resist Byzantine clients. This usually requires estimating the unknown number of Byzantine clients and accordingly selecting an aggregator with a proper degree of robustness (i.e., the maximum number of Byzantine clients the aggregator can tolerate). Such an estimate can have an important effect on performance, which, to our knowledge, has not been systematically studied. This work fills this gap by theoretically analyzing the worst-case error of aggregators, as well as that of the induced federated learning algorithm, for any estimated number $\hat{f}$ and actual number $f$ of Byzantine clients. Specifically, we show that underestimation ($\hat{f} < f$) can lead to arbitrarily poor performance for both aggregators and federated learning. For non-underestimation ($\hat{f} \ge f$), we prove optimal lower and upper bounds of the same order on the errors of both aggregators and federated learning. All these optimal bounds are proportional to $\hat{f}/n$ with $n$ clients, and thus monotonically increase with larger $\hat{f}$. This indicates a fundamental trade-off: while an aggregator with a larger robustness degree $\hat{f}$ can solve federated learning problems over the wider range $f \le \hat{f}$, its performance can deteriorate when there are actually fewer or even no Byzantine clients (i.e., $f < \hat{f}$).
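To make the robustness degree concrete, below is a minimal sketch (not from the paper) using a coordinate-wise trimmed mean, a standard robust aggregator whose robustness degree is exactly its trimming parameter $\hat{f}$; the function name `trimmed_mean` and the toy numbers are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def trimmed_mean(updates, f_hat):
    """Coordinate-wise trimmed mean: drop the f_hat largest and f_hat
    smallest values in each coordinate, then average the rest.
    Tolerates up to f_hat Byzantine clients (requires n > 2 * f_hat)."""
    updates = np.asarray(updates)  # shape (n, d): one update per client
    n = updates.shape[0]
    assert n > 2 * f_hat, "need n > 2 * f_hat clients"
    sorted_updates = np.sort(updates, axis=0)
    return sorted_updates[f_hat:n - f_hat].mean(axis=0)

# Toy example: n = 10 clients, honest gradients near 1.0, f = 2 Byzantine.
rng = np.random.default_rng(0)
honest = rng.normal(1.0, 0.1, size=(8, 3))
byzantine = np.full((2, 3), 100.0)  # arbitrarily bad updates
updates = np.vstack([honest, byzantine])

print(trimmed_mean(updates, f_hat=2))  # f_hat >= f: result stays near 1.0
print(trimmed_mean(updates, f_hat=1))  # f_hat < f: one bad update survives
```

With $\hat{f} = 2 \ge f$, both Byzantine rows are trimmed and the output stays near the honest mean; with $\hat{f} = 1 < f$, one arbitrarily bad update survives the trim and can skew the output without bound. Conversely, raising $\hat{f}$ when $f$ is small (or zero) discards more honest updates per coordinate, which is the cost side of the trade-off the paper analyzes.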