Momentum Aggregation for Private Non-convex ERM

12 October 2022
Hoang Tran
Ashok Cutkosky
Abstract

We introduce new algorithms and convergence guarantees for privacy-preserving non-convex Empirical Risk Minimization (ERM) on smooth $d$-dimensional objectives. We develop an improved sensitivity analysis of stochastic gradient descent on smooth objectives that exploits the recurrence of examples in different epochs. By combining this new approach with recent analyses of momentum with private aggregation techniques, we provide an $(\epsilon,\delta)$-differentially private algorithm that finds a gradient of norm $\tilde O\left(\frac{d^{1/3}}{(\epsilon N)^{2/3}}\right)$ in $O\left(\frac{N^{7/3}\epsilon^{4/3}}{d^{2/3}}\right)$ gradient evaluations, improving the previous best gradient bound of $\tilde O\left(\frac{d^{1/4}}{\sqrt{\epsilon N}}\right)$.
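The paper's exact algorithm and noise calibration are not reproduced on this page. As a rough illustration of the ingredients the abstract names (per-example sensitivity control via clipping, Gaussian privatization of the averaged gradient, and momentum aggregation of the privatized gradients), here is a minimal NumPy sketch. The toy objective, the helper name per_example_grads, and all parameter values (clip_norm, noise_mult, batch_size, momentum, lr) are hypothetical choices for illustration, not the settings derived in the paper.

import numpy as np

# Hypothetical toy setup: smooth non-convex ERM loss on synthetic data.
rng = np.random.default_rng(0)
N, d = 1000, 20
X = rng.normal(size=(N, d))
y = rng.normal(size=N)

def per_example_grads(w):
    # Per-example gradients of a smooth non-convex loss (squared error through tanh).
    z = np.tanh(X @ w)
    residual = z - y
    return (residual * (1.0 - z**2))[:, None] * X  # shape (N, d)

# Illustrative privacy/noise parameters; the paper derives its own calibration.
clip_norm = 1.0     # per-example gradient clipping bound (controls sensitivity)
noise_mult = 1.0    # Gaussian noise multiplier
batch_size = 100
momentum = 0.9
lr = 0.1
steps = 200

w = np.zeros(d)
m = np.zeros(d)
for _ in range(steps):
    idx = rng.choice(N, size=batch_size, replace=False)
    g = per_example_grads(w)[idx]
    # Clip each per-example gradient to bound sensitivity, then average.
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip_norm)
    g_mean = g.mean(axis=0)
    # Privatize the averaged gradient with Gaussian noise scaled to the clipped sensitivity.
    noisy_g = g_mean + rng.normal(scale=noise_mult * clip_norm / batch_size, size=d)
    # Momentum aggregation of the privatized gradients.
    m = momentum * m + (1.0 - momentum) * noisy_g
    w = w - lr * m

print("final gradient norm:", np.linalg.norm(per_example_grads(w).mean(axis=0)))

The sketch only shows the generic DP-SGD-with-momentum pattern; the paper's contribution is a tighter sensitivity analysis across epochs and the resulting improved gradient-norm bound, which this example does not attempt to reproduce.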
