Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

11 May 2017
Yue Yu
Longbo Huang
arXiv:1705.04138
Abstract

We consider the stochastic composition optimization problem proposed in \cite{wang2017stochastic}, which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives, and has a convergence rate of $O(\log S / S)$, which improves upon the $O(S^{-4/9})$ rate in \cite{wang2016accelerating} when the objective is convex and Lipschitz smooth. Moreover, com-SVR-ADMM possesses a rate of $O(1/\sqrt{S})$ when the objective is convex but without Lipschitz smoothness. We also conduct experiments and show that it outperforms existing algorithms.
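
To make the setting concrete, the following is a minimal NumPy sketch of the composition objective and of an SVRG-style variance-reduced estimate of its gradient, the core device behind the rates quoted above. Everything in it is an illustrative assumption rather than the authors' algorithm: the linear inner maps g_j, quadratic outer losses f_i, step size, and plain stochastic-descent updates are chosen for simplicity, and the ADMM splitting that gives com-SVR-ADMM its name is omitted.

import numpy as np

# Sketch (not the paper's code) of the stochastic composition problem
#   min_x F(x) = (1/n) sum_i f_i( (1/m) sum_j g_j(x) )
# with g_j(x) = A_j x and f_i(y) = 0.5*||y - b_i||^2, chosen so the
# exact gradient g_bar^T (g_bar x - b_bar) is available for checking.
rng = np.random.default_rng(0)
d, p, m, n = 5, 3, 40, 40
G = rng.standard_normal((p, d))
A = G + 0.1 * rng.standard_normal((m, p, d))   # inner samples: g_j(x) = A[j] @ x
b = rng.standard_normal((n, p))                # outer samples: f_i(y) = 0.5*||y - b[i]||^2
g_bar = A.mean(axis=0)                         # mean inner map: g(x) = g_bar @ x

def full_grad(x):
    # Exact gradient of F, used once per epoch and as a convergence check.
    return g_bar.T @ (g_bar @ x - b.mean(axis=0))

def svr_grad(x, x_s, g_s, grad_s):
    # SVRG-style estimate of grad F(x) = (dg(x))^T grad f(g(x)):
    # correct the snapshot's full gradient with cheap sampled differences.
    j = rng.integers(m)                        # sample for the inner value g(x)
    y_hat = A[j] @ (x - x_s) + g_s             # variance-reduced estimate of g(x)
    k, i = rng.integers(m), rng.integers(n)    # samples for Jacobian and outer grad
    return A[k].T @ ((y_hat - b[i]) - (g_s - b[i])) + grad_s

x, eta = np.zeros(d), 0.05
for epoch in range(30):
    x_s = x.copy()                 # snapshot point
    g_s = g_bar @ x_s              # one full inner pass at the snapshot
    grad_s = full_grad(x_s)        # one full gradient at the snapshot
    for _ in range(2 * m):         # cheap stochastic inner steps
        x = x - eta * svr_grad(x, x_s, g_s, grad_s)
print("gradient norm after 30 epochs:", np.linalg.norm(full_grad(x)))

Because y_hat collapses to g(x_s) at the snapshot and the two outer-gradient terms in the correction share the same sample i, the estimator's variance vanishes as x approaches x_s, which is what lets variance-reduced methods keep a constant step size. With quadratic f_i the estimate is exactly unbiased; for general f the sampled inner value y_hat introduces a bias that composition methods such as com-SVR-ADMM must control in their analysis.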
