ResearchTrend.AI
Single-loop Algorithms for Stochastic Non-convex Optimization with Weakly-Convex Constraints

21 April 2025
Ming Yang
Gang Li
Quanqi Hu
Qihang Lin
Tianbao Yang
Abstract

Constrained optimization with multiple functional inequality constraints has significant applications in machine learning. This paper examines a crucial subset of such problems where both the objective and constraint functions are weakly convex. Existing methods often face limitations, including slow convergence rates or reliance on double-loop algorithmic designs. To overcome these challenges, we introduce a novel single-loop penalty-based stochastic algorithm. Following the classical exact penalty method, our approach employs a hinge-based penalty, which permits the use of a constant penalty parameter, enabling us to achieve a state-of-the-art complexity for finding an approximate Karush-Kuhn-Tucker (KKT) solution. We further extend our algorithm to address finite-sum coupled compositional objectives, which are prevalent in artificial intelligence applications, establishing improved complexity over existing approaches. Finally, we validate our method through experiments on fair learning with receiver operating characteristic (ROC) fairness constraints and continual learning with non-forgetting constraints.
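To illustrate the single-loop, hinge-based penalty idea described above, here is a minimal sketch: the constrained problem is replaced by minimizing f(x) + rho * max(0, g(x)) with a constant penalty parameter rho, taking one stochastic subgradient step per iteration with no inner loop. The toy problem (f(x) = (x - 2)^2 / 2 subject to g(x) = x - 1 <= 0), the step size, and the noise model are illustrative assumptions, not the paper's exact algorithm or its complexity analysis.

```python
import random

random.seed(0)

def f_grad(x):
    # Noisy gradient of the toy objective f(x) = (x - 2)^2 / 2.
    return (x - 2.0) + 0.01 * random.gauss(0.0, 1.0)

def g(x):
    # Single inequality constraint g(x) = x - 1 <= 0.
    return x - 1.0

def hinge_penalty_sgd(x0, rho=5.0, eta=0.002, iters=5000):
    """Single loop: each iteration takes one stochastic subgradient step on
    the exact penalty F(x) = f(x) + rho * max(0, g(x)).

    The hinge max(0, g) lets rho stay constant (any rho larger than the
    optimal multiplier suffices for this toy problem), so no penalty
    schedule or inner subproblem is needed.
    """
    x = x0
    for _ in range(iters):
        grad = f_grad(x)
        if g(x) > 0:
            # Hinge term is active: add rho * g'(x); here g'(x) = 1.
            grad += rho
        x -= eta * grad
    return x

x = hinge_penalty_sgd(0.0)
# x settles near the constrained optimum x = 1, where the constraint is tight.
```

The unconstrained minimizer of f is x = 2, which violates the constraint, so the iterates are driven to the boundary x = 1; this matches the exact-penalty intuition that, for a large enough constant rho, minimizers of the penalized problem are feasible.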

@article{yang2025_2504.15243,
  title={Single-loop Algorithms for Stochastic Non-convex Optimization with Weakly-Convex Constraints},
  author={Ming Yang and Gang Li and Quanqi Hu and Qihang Lin and Tianbao Yang},
  journal={arXiv preprint arXiv:2504.15243},
  year={2025}
}