Although robust learning and local differential privacy are both widely studied fields of research, combining the two settings is only starting to be explored. We consider the problem of estimating a discrete distribution in total variation from n contaminated data batches under a local differential privacy constraint. A fraction 1−ε of the batches contain k i.i.d. samples drawn from a discrete distribution p over d elements. To protect the users' privacy, each of the samples is privatized using an α-locally differentially private mechanism. The remaining εn batches are an adversarial contamination. The minimax rate of estimation under contamination alone, with no privacy, is known to be ε/√k + √(d/(kn)), up to a √log(1/ε) factor. Under the privacy constraint alone, the minimax rate of estimation is √(d²/(α²kn)). We show that combining the two constraints leads to a minimax estimation rate of ε√(d/(α²k)) + √(d²/(α²kn)), up to a √log(1/ε) factor, larger than the sum of the two separate rates. We provide a polynomial-time algorithm achieving this bound, as well as a matching information-theoretic lower bound.
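To make the privatization step concrete, the sketch below implements k-ary randomized response, a standard α-locally differentially private mechanism for discrete data, together with the usual unbiased inversion of the channel to estimate the distribution. This is only an illustration of the LDP constraint in the uncontaminated setting; it is not the paper's robust polynomial-time algorithm, and all function names here are hypothetical.

```python
import math
import random
from collections import Counter

def krr_privatize(x, d, alpha):
    """k-ary randomized response over d symbols: report the true symbol
    with probability e^alpha / (e^alpha + d - 1), otherwise a uniformly
    random other symbol. The ratio of report probabilities for any two
    inputs is at most e^alpha, so the mechanism is alpha-LDP."""
    p_true = math.exp(alpha) / (math.exp(alpha) + d - 1)
    if random.random() < p_true:
        return x
    other = random.randrange(d - 1)
    return other if other < x else other + 1  # skip the true symbol

def estimate(reports, d, alpha):
    """Unbiased estimate of the source distribution, obtained by
    inverting the randomized-response channel coordinate-wise."""
    n = len(reports)
    e_a = math.exp(alpha)
    p_true = e_a / (e_a + d - 1)
    p_other = 1.0 / (e_a + d - 1)
    counts = Counter(reports)
    # E[counts[j]/n] = p_j * p_true + (1 - p_j) * p_other; solve for p_j.
    return [(counts.get(j, 0) / n - p_other) / (p_true - p_other)
            for j in range(d)]
```

Because the channel shrinks the gap between report probabilities to p_true − p_other = (e^α − 1)/(e^α + d − 1), inverting it inflates the sampling noise, which is the source of the extra d/α² dependence in the private rates above.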