arXiv:2002.07290
Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization

17 February 2020
Quoc Tran-Dinh
Nhan H. Pham
Lam M. Nguyen
Abstract

We develop two new stochastic Gauss-Newton algorithms for solving a class of nonconvex stochastic compositional optimization problems that frequently arise in practice. We consider both the expectation and finite-sum settings under standard assumptions, and use both classical stochastic and SARAH estimators to approximate function values and Jacobians. In the expectation case, we establish $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity to achieve a stationary point in expectation and estimate the total number of stochastic oracle calls for both the function value and its Jacobian, where $\varepsilon$ is a desired accuracy. In the finite-sum case, we also estimate $\mathcal{O}(\varepsilon^{-2})$ iteration-complexity and the total number of oracle calls with high probability. To the best of our knowledge, this is the first time such global stochastic oracle complexity has been established for stochastic Gauss-Newton methods. Finally, we illustrate our theoretical results via two numerical examples on both synthetic and real datasets.
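To make the setting concrete, the following is a minimal sketch of a regularized stochastic Gauss-Newton loop for the special case $\min_x \tfrac{1}{2}\|F(x)\|^2$, where $F$ and its Jacobian are only available through noisy mini-batch estimates. The function names (`sample_F`, `sample_J`) and the simple fixed-regularization update are illustrative assumptions, not the paper's exact algorithms or estimators:

```python
import numpy as np

def stochastic_gauss_newton(x0, sample_F, sample_J,
                            n_iters=100, mu=1e-3, step=1.0, seed=0):
    """Illustrative stochastic Gauss-Newton sketch for min_x 0.5*||F(x)||^2.

    sample_F(x, rng) -> noisy estimate of F(x)        (shape (m,))
    sample_J(x, rng) -> noisy estimate of Jacobian(x) (shape (m, n))
    mu is a Levenberg-style damping term; this is NOT the paper's method,
    only a hand-rolled sketch of the general idea.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        Fx = sample_F(x, rng)          # stochastic function-value oracle
        Jx = sample_J(x, rng)          # stochastic Jacobian oracle
        g = Jx.T @ Fx                  # gradient of the quadratic model
        H = Jx.T @ Jx + mu * np.eye(x.size)  # damped Gauss-Newton matrix
        x = x - step * np.linalg.solve(H, g)
    return x
```

For example, on a toy linear residual $F(x) = Ax - b$ with small additive noise, the iterates approach the least-squares solution:

```python
A = np.array([[2.0, 0.0], [0.0, 3.0]])
b = np.array([2.0, 3.0])
x = stochastic_gauss_newton(
    np.zeros(2),
    lambda x, rng: A @ x - b + 0.01 * rng.standard_normal(2),
    lambda x, rng: A,
    n_iters=50,
)
# x is close to the solution [1.0, 1.0]
```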
