We investigate the trade-off between rate, privacy, and storage in federated learning (FL) with top $r$ sparsification, where the users and the servers in the FL system only share the most significant $r$ and $r'$ fractions, respectively, of updates and parameters in the FL process, to reduce the communication cost. We present schemes that guarantee information-theoretic privacy of the values and indices of the sparse updates sent by the users, at the expense of a larger storage cost. We then generalize the proposed scheme to reduce the storage cost by allowing a certain amount of information leakage. Thus, we characterize the general trade-off between communication cost, storage cost, and information leakage in private FL with top $r$ sparsification, along the lines of the two proposed schemes.
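For concreteness, the following is a minimal sketch of the user-side top $r$ sparsification step described above (in Python; the function and variable names are our own illustrative choices, not the paper's notation, and the privacy-preserving encoding of the selected indices and values, which is the paper's contribution, is not shown):

```python
import numpy as np

def top_r_sparsify(update, r):
    """Keep only the largest-magnitude r fraction of entries of a model
    update; return their indices and values (the rest are dropped).

    Illustrative sketch of top r sparsification only; the schemes in the
    paper additionally hide these indices and values from the servers.
    """
    d = update.size
    k = max(1, int(np.floor(r * d)))                # number of entries kept
    idx = np.argpartition(np.abs(update), -k)[-k:]  # indices of the top-k magnitudes
    return idx, update[idx]

# Example: a user communicates only the top 10% of a 20-dimensional update.
rng = np.random.default_rng(0)
update = rng.standard_normal(20)
indices, values = top_r_sparsify(update, r=0.1)
print(indices, values)  # 2 (index, value) pairs instead of all 20 entries
```

The servers' top $r'$ sparsification of the global parameters can be sketched analogously; only the selected fraction is downloaded by the users, which is what reduces the communication cost in both directions.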