Training GANs with Centripetal Acceleration

24 February 2019
Wei Peng, Yuhong Dai, Hui Zhang, Lizhi Cheng
Tags: GAN
Links: ArXiv · PDF · HTML
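This page gives only the title and metadata of the paper, so the following is a minimal sketch of the general idea as I understand it: a gradient-difference correction, approximating the centripetal acceleration of iterates that cycle around a saddle point, added to simultaneous gradient steps. The toy objective min_x max_y x*y, the step sizes alpha and beta, and the function names below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, not taken from this page: the update rule, the toy bilinear
# objective L(x, y) = x*y, and the step sizes alpha and beta are my own
# illustrative assumptions of the general idea -- adding a gradient-difference
# term that approximates the centripetal acceleration of cycling iterates and
# pulls them toward the equilibrium.
import numpy as np

def field(z):
    """Simultaneous-gradient field F(x, y) = (dL/dx, -dL/dy) for L(x, y) = x*y."""
    x, y = z
    return np.array([y, -x])

def run(alpha=0.05, beta=0.05, steps=2000, use_correction=True):
    z = np.array([1.0, 1.0])   # start away from the saddle point at the origin
    g_prev = field(z)          # previous gradient (correction is zero at step 0)
    for _ in range(steps):
        g = field(z)
        step = alpha * g
        if use_correction:
            # Gradient-difference term: a discrete approximation of the
            # centripetal acceleration of the rotating iterates.
            step = step + beta * (g - g_prev)
        z = z - step
        g_prev = g
    return np.linalg.norm(z)

print("plain simultaneous gradient steps:", run(use_correction=False))  # norm grows
print("with gradient-difference term:   ", run(use_correction=True))    # norm shrinks
```

On this toy problem, plain simultaneous gradient steps spiral outward while the corrected update contracts toward the saddle point; with beta = alpha the corrected step coincides with the optimistic-gradient update, which hints at why the citing papers below come largely from the min-max and variational-inequality literature.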

Papers citing "Training GANs with Centripetal Acceleration" (8 of 8 papers shown)

  1. First Order Methods with Markovian Noise: from Acceleration to Variational Inequalities
     Aleksandr Beznosikov, S. Samsonov, Marina Sheshukova, Alexander Gasnikov, A. Naumov, Eric Moulines
     25 May 2023

  2. Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
     Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov
     15 Feb 2023

  3. Accelerated Single-Call Methods for Constrained Min-Max Optimization
     Yang Cai, Weiqiang Zheng
     06 Oct 2022

  4. On Scaled Methods for Saddle Point Problems
     Aleksandr Beznosikov, Aibek Alanov, D. Kovalev, Martin Takáč, Alexander Gasnikov
     16 Jun 2022

  5. Optimal Algorithms for Decentralized Stochastic Variational Inequalities
     D. Kovalev, Aleksandr Beznosikov, Abdurakhmon Sadiev, Michael Persiianov, Peter Richtárik, Alexander Gasnikov
     06 Feb 2022

  6. Training Generative Adversarial Networks with Adaptive Composite Gradient
     Huiqing Qi, Fang Li, Shengli Tan, Xiangyun Zhang
     Tags: GAN
     10 Nov 2021

  7. The limits of min-max optimization algorithms: convergence to spurious non-critical sets
     Ya-Ping Hsieh, P. Mertikopoulos, V. Cevher
     16 Jun 2020

  8. ODE Analysis of Stochastic Gradient Methods with Optimism and Anchoring for Minimax Problems
     Ernest K. Ryu, Kun Yuan, W. Yin
     26 May 2019