Dissecting adaptive methods in GANs


9 October 2022
Samy Jelassi, David Dobre, A. Mensch, Yuanzhi Li, Gauthier Gidel
ArXiv (abs) · PDF · HTML

Papers citing "Dissecting adaptive methods in GANs"

3 / 3 papers shown
1. Seesaw: Accelerating Training by Balancing Learning Rate and Batch Size Scheduling
   Alexandru Meterez, Depen Morwani, Jingfeng Wu, Costin-Andrei Oncescu, Cengiz Pehlevan, Sham Kakade
   LRM · 96 · 1 · 0 · 16 Oct 2025
2. DoWG Unleashed: An Efficient Universal Parameter-Free Gradient Descent Method
   Neural Information Processing Systems (NeurIPS), 2023
   Ahmed Khaled, Konstantin Mishchenko, Chi Jin
   ODL · 290 · 38 · 0 · 25 May 2023
3. Unifying GANs and Score-Based Diffusion as Generative Particle Models
   Neural Information Processing Systems (NeurIPS), 2023
   Jean-Yves Franceschi, Mike Gartrell, Ludovic Dos Santos, Thibaut Issenhuth, Emmanuel de Bezenac, Mickaël Chen, A. Rakotomamonjy
   DiffM · 310 · 28 · 0 · 25 May 2023