ResearchTrend.AI

arXiv:2307.11228
From Adaptive Query Release to Machine Unlearning

20 July 2023
Enayat Ullah
R. Arora
Abstract

We formalize the problem of machine unlearning as the design of efficient unlearning algorithms corresponding to learning algorithms which perform a selection of adaptive queries from structured query classes. We give efficient unlearning algorithms for linear and prefix-sum query classes. As applications, we show that unlearning in many problems, in particular, stochastic convex optimization (SCO), can be reduced to the above, yielding improved guarantees for the problem. In particular, for smooth Lipschitz losses and any $\rho>0$, our results yield an unlearning algorithm with excess population risk of $\tilde O\big(\frac{1}{\sqrt{n}}+\frac{\sqrt{d}}{n\rho}\big)$ with unlearning query (gradient) complexity $\tilde O(\rho \cdot \text{Retraining Complexity})$, where $d$ is the model dimensionality and $n$ is the initial number of samples. For non-smooth Lipschitz losses, we give an unlearning algorithm with excess population risk $\tilde O\big(\frac{1}{\sqrt{n}}+\big(\frac{\sqrt{d}}{n\rho}\big)^{1/2}\big)$ with the same unlearning query (gradient) complexity. Furthermore, in the special case of Generalized Linear Models (GLMs), such as those in linear and logistic regression, we get dimension-independent rates of $\tilde O\big(\frac{1}{\sqrt{n}} +\frac{1}{(n\rho)^{2/3}}\big)$ and $\tilde O\big(\frac{1}{\sqrt{n}} +\frac{1}{(n\rho)^{1/3}}\big)$ for smooth Lipschitz and non-smooth Lipschitz losses respectively. Finally, we give generalizations of the above from one unlearning request to \textit{dynamic} streams consisting of insertions and deletions.
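The appeal of structured query classes can be seen in a toy example. This is not the paper's algorithm, only a minimal sketch of the underlying intuition for linear queries: since a linear query's answer is a sum of per-sample contributions, deleting a sample only requires subtracting its cached contribution rather than recomputing over the remaining data. The class name `LinearQueryUnlearner` and the feature map `phi` are illustrative inventions, not identifiers from the paper.

```python
# Toy sketch: cheap "unlearning" for a linear query.
# A linear query answers sum(phi(x) for x in data); caching each
# sample's contribution makes deletion an O(1) update instead of
# an O(n) recomputation (the analogue of full retraining here).

class LinearQueryUnlearner:
    def __init__(self, data, phi):
        self.phi = phi
        # Cache each sample's contribution, keyed by sample index.
        self.contrib = {i: phi(x) for i, x in enumerate(data)}
        self.answer = sum(self.contrib.values())

    def delete(self, i):
        # Unlearn sample i: subtract its cached contribution.
        self.answer -= self.contrib.pop(i)
        return self.answer

data = [1.0, 2.0, 3.0, 4.0]
u = LinearQueryUnlearner(data, phi=lambda x: x * x)
print(u.answer)      # 1 + 4 + 9 + 16 = 30.0
print(u.delete(2))   # remove sample 3.0 -> 30 - 9 = 21.0
```

Prefix-sum queries require slightly more bookkeeping (a deletion invalidates every prefix that includes the deleted sample), which is where the paper's more careful algorithms come in.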
