RECAPP: Crafting a More Efficient Catalyst for Convex Optimization
17 June 2022
Y. Carmon, A. Jambulapati, Yujia Jin, Aaron Sidford

Papers citing "RECAPP: Crafting a More Efficient Catalyst for Convex Optimization"

7 / 7 papers shown
1. Stabilized Proximal-Point Methods for Federated Optimization
   Xiaowen Jiang, Anton Rodomanov, Sebastian U. Stich
   FedML · 09 Jul 2024

2. A Whole New Ball Game: A Primal Accelerated Method for Matrix Games and Minimizing the Maximum of Smooth Functions
   Y. Carmon, A. Jambulapati, Yujia Jin, Aaron Sidford
   17 Nov 2023

3. Breaking the Lower Bound with (Little) Structure: Acceleration in Non-Convex Stochastic Optimization with Heavy-Tailed Noise
   Zijian Liu, Jiawei Zhang, Zhengyuan Zhou
   14 Feb 2023

4. An Optimal Algorithm for Strongly Convex Min-min Optimization
   Alexander Gasnikov, D. Kovalev, Grigory Malinovsky
   29 Dec 2022

5. Smooth Monotone Stochastic Variational Inequalities and Saddle Point Problems: A Survey
   Aleksandr Beznosikov, Boris Polyak, Eduard A. Gorbunov, D. Kovalev, Alexander Gasnikov
   29 Aug 2022

6. The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization
   D. Kovalev, Alexander Gasnikov
   11 May 2022

7. A Proximal Stochastic Gradient Method with Progressive Variance Reduction
   Lin Xiao, Tong Zhang
   ODL · 19 Mar 2014