
Adapting to Function Difficulty and Growth Conditions in Private Optimization

Abstract

We develop algorithms for private stochastic convex optimization that adapt to the hardness of the specific function we wish to optimize. While previous work provides worst-case bounds for arbitrary convex functions, the function at hand often belongs to a smaller class that enjoys faster rates. Concretely, we show that for functions exhibiting $\kappa$-growth around the optimum, i.e., $f(x) \ge f(x^\star) + \lambda \kappa^{-1} \|x - x^\star\|_2^\kappa$ for $\kappa > 1$, our algorithms improve upon the standard $\sqrt{d}/(n\varepsilon)$ privacy rate to the faster $(\sqrt{d}/(n\varepsilon))^{\kappa/(\kappa-1)}$. Crucially, they achieve these rates without knowledge of the growth constant $\kappa$ of the function. Our algorithms build upon the inverse sensitivity mechanism, which adapts to instance difficulty (Asi & Duchi, 2020), and recent localization techniques in private optimization (Feldman et al., 2020). We complement our algorithms with matching lower bounds for these function classes and demonstrate that our adaptive algorithm is \emph{simultaneously} (minimax) optimal over all $\kappa \ge 1 + c$ whenever $c = \Theta(1)$.
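As a hypothetical numerical illustration (not from the paper), the one-dimensional function $f(x) = (\lambda/\kappa)|x|^\kappa$ satisfies the $\kappa$-growth condition with equality at its minimizer $x^\star = 0$; the sketch below checks this and compares the worst-case rate $\sqrt{d}/(n\varepsilon)$ against the faster adaptive rate $(\sqrt{d}/(n\varepsilon))^{\kappa/(\kappa-1)}$ for assumed values of $d$, $n$, and $\varepsilon$:

```python
import math

# Example function: f(x) = (lam/kappa) * |x|^kappa with minimizer x* = 0.
# It satisfies the kappa-growth condition with equality:
#   f(x) >= f(x*) + (lam/kappa) * |x - x*|**kappa
lam, kappa = 1.0, 2.0

def f(x):
    return (lam / kappa) * abs(x) ** kappa

x_star = 0.0
for x in [-2.0, -0.5, 0.1, 1.0, 3.0]:
    lhs = f(x)
    rhs = f(x_star) + (lam / kappa) * abs(x - x_star) ** kappa
    assert lhs >= rhs - 1e-12  # growth condition holds (with equality here)

# Rate comparison for assumed d = 100, n = 10_000, eps = 1.0:
d, n, eps = 100, 10_000, 1.0
base = math.sqrt(d) / (n * eps)           # worst-case rate: 0.001
adaptive = base ** (kappa / (kappa - 1))  # kappa = 2 gives base**2 = 1e-6
print(base, adaptive)  # the adaptive rate is much smaller when base < 1
```

For $\kappa = 2$ (quadratic growth, as for strongly convex functions) the exponent $\kappa/(\kappa-1) = 2$, so the rate is squared; as $\kappa \to \infty$ the exponent tends to 1 and the adaptive rate degrades toward the worst-case one.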
