
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed

7 January 2021
Eric Luhman, Troy Luhman
DiffM

Papers citing "Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed"

3 / 3 papers shown
Title | Authors | Tags | Date
SoccerDiffusion: Toward Learning End-to-End Humanoid Robot Soccer from Gameplay Recordings | Florian Vahl, Jörn Griepenburg, Jan Gutsche, Jasper Güldenstein, Jianwei Zhang | VGen | 29 Apr 2025
Integration Flow Models | Jingjing Wang, Dan Zhang, Joshua Luo, Yin Yang, Feng Luo | — | 28 Apr 2025
Fast Autoregressive Models for Continuous Latent Generation | Tiankai Hang, Jianmin Bao, Fangyun Wei, Dong Chen | DiffM | 24 Apr 2025