
Inexact subgradient methods for semialgebraic functions

30 April 2024
Jérôme Bolte
Tam Le
Éric Moulines
Edouard Pauwels
Abstract

Motivated by the extensive application of approximate gradients in machine learning and optimization, we investigate inexact subgradient methods subject to persistent additive errors. Within a nonconvex semialgebraic framework, assuming boundedness or coercivity, we establish that the method yields iterates that eventually fluctuate near the critical set at a proximity characterized by an O(ϵ^ρ) distance, where ϵ denotes the magnitude of subgradient evaluation errors, and ρ encapsulates geometric characteristics of the underlying problem. Our analysis comprehensively addresses both vanishing and constant step-size regimes. Notably, the latter regime inherently enlarges the fluctuation region, yet this enlargement remains on the order of ϵ^ρ. In the convex scenario, employing a universal error bound applicable to coercive semialgebraic functions, we derive novel complexity results concerning averaged iterates. Additionally, our study produces auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the behavior of algorithmic sequences under small-step limits.
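The setting the abstract describes can be illustrated with a minimal sketch: a subgradient step whose oracle is corrupted by a bounded additive error of magnitude at most ϵ, run with a constant step size. This is not the paper's analysis, only a toy instance on the coercive semialgebraic function f(x) = |x| (whose critical set is {0}); the function names, step size, and error model below are illustrative assumptions.

```python
import random

def inexact_subgradient(subgrad, x0, step, eps, n_iters, seed=0):
    """Constant-step subgradient method with a persistent additive
    oracle error e, |e| <= eps, as in the inexact setting above."""
    rng = random.Random(seed)
    x = x0
    traj = []
    for _ in range(n_iters):
        g = subgrad(x)                 # exact subgradient
        e = rng.uniform(-eps, eps)     # additive evaluation error
        x = x - step * (g + e)         # inexact subgradient step
        traj.append(x)
    return traj

# f(x) = |x| is coercive and semialgebraic; a subgradient is sign(x).
subgrad_abs = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

traj = inexact_subgradient(subgrad_abs, x0=5.0, step=0.01, eps=0.1,
                           n_iters=2000)
tail = traj[-200:]
# Late iterates fluctuate near the critical set {0}, within a band
# whose width scales with the step size and the error level eps.
print(max(abs(x) for x in tail))
```

Shrinking `step` or `eps` visibly tightens the fluctuation band around 0, matching the qualitative picture of iterates hovering at an error-dependent distance from the critical set.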

@article{bolte2025_2404.19517,
  title={Inexact subgradient methods for semialgebraic functions},
  author={Jérôme Bolte and Tam Le and Éric Moulines and Edouard Pauwels},
  journal={arXiv preprint arXiv:2404.19517},
  year={2025}
}