ResearchTrend.AI

Katyusha Acceleration for Convex Finite-Sum Compositional Optimization (arXiv:1910.11217)

24 October 2019
Yibo Xu, Yangyang Xu

Papers citing "Katyusha Acceleration for Convex Finite-Sum Compositional Optimization" (4 papers)
Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization
Tianyi Chen, Yuejiao Sun, W. Yin
25 Aug 2020
Momentum-based variance-reduced proximal stochastic gradient method for composite nonconvex stochastic optimization
Yangyang Xu, Yibo Xu
31 May 2020
Stochastic Gauss-Newton Algorithms for Nonconvex Compositional Optimization
Quoc Tran-Dinh, Nhan H. Pham, Lam M. Nguyen
17 Feb 2020
A Proximal Stochastic Gradient Method with Progressive Variance Reduction
Lin Xiao, Tong Zhang
19 Mar 2014