Achieving Linear Speedup in Decentralized Stochastic Compositional Minimax Optimization
Hongchang Gao
arXiv:2307.13430, 25 July 2023
Papers citing "Achieving Linear Speedup in Decentralized Stochastic Compositional Minimax Optimization" (5 of 5 papers shown):

- Decentralized Multi-Level Compositional Optimization Algorithms with Level-Independent Convergence Rate. Hongchang Gao. 06 Jun 2023.
- GT-STORM: Taming Sample, Communication, and Memory Complexities in Decentralized Non-Convex Learning. Xin Zhang, Jia Liu, Zhengyuan Zhu, Elizabeth S. Bentley. 04 May 2021.
- DecentLaM: Decentralized Momentum SGD for Large-batch Deep Training. Kun Yuan, Yiming Chen, Xinmeng Huang, Yingya Zhang, Pan Pan, Yinghui Xu, W. Yin. 24 Apr 2021. [MoE]
- Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization. Tianyi Chen, Yuejiao Sun, W. Yin. 25 Aug 2020.
- Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks. Chelsea Finn, Pieter Abbeel, Sergey Levine. 09 Mar 2017. [OOD]