
On the Sample Complexity of Privately Learning Axis-Aligned Rectangles

Abstract

We revisit the fundamental problem of learning Axis-Aligned Rectangles over a finite grid $X^d\subseteq\mathbb{R}^d$ with differential privacy. Existing results show that the sample complexity of this problem is at most $\min\left\{ d{\cdot}\log|X| \;,\; d^{1.5}{\cdot}\left(\log^*|X| \right)^{1.5}\right\}$. That is, existing constructions either require sample complexity that grows linearly with $\log|X|$, or else it grows super linearly with the dimension $d$. We present a novel algorithm that reduces the sample complexity to only $\tilde{O}\left\{d{\cdot}\left(\log^*|X|\right)^{1.5}\right\}$, attaining a dimensionality optimal dependency without requiring the sample complexity to grow with $\log|X|$. The technique used in order to attain this improvement involves the deletion of "exposed" data-points on the go, in a fashion designed to avoid the cost of the adaptive composition theorems. The core of this technique may be of individual interest, introducing a new method for constructing statistically-efficient private algorithms.
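To give a rough sense of the gap between the two bounds, the following sketch (not the paper's algorithm) evaluates the prior bound $\min\{d\cdot\log|X|,\; d^{1.5}\cdot(\log^*|X|)^{1.5}\}$ and the new bound $d\cdot(\log^*|X|)^{1.5}$ numerically; the choices of $d$ and $|X|$ are arbitrary, and hidden constants and polylogarithmic factors are ignored.

```python
import math

def log_star(x: float, base: float = 2.0) -> int:
    """Iterated logarithm: how many times log must be applied before the value drops to <= 1."""
    count = 0
    while x > 1.0:
        x = math.log(x, base)
        count += 1
    return count

def prior_bound(d: int, grid_size: float) -> float:
    """min{ d*log|X| , d^1.5 * (log*|X|)^1.5 } -- the previously known upper bound (constants ignored)."""
    return min(d * math.log2(grid_size),
               d ** 1.5 * log_star(grid_size) ** 1.5)

def new_bound(d: int, grid_size: float) -> float:
    """d * (log*|X|)^1.5 -- the improved bound, up to polylogarithmic factors."""
    return d * log_star(grid_size) ** 1.5

if __name__ == "__main__":
    d, grid_size = 50, 2.0 ** 64  # hypothetical dimension d and per-axis grid size |X|
    print(f"log*|X|      = {log_star(grid_size)}")
    print(f"prior bound ~= {prior_bound(d, grid_size):.0f}")
    print(f"new bound   ~= {new_bound(d, grid_size):.0f}")
```

Because $\log^*|X|$ stays tiny even for astronomically large grids, the new bound grows only linearly in $d$, whereas the prior constructions pay either the $\log|X|$ factor or the extra $d^{0.5}$ factor.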
