Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman · 7 January 2021 · DiffM
Papers citing "Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed" (4 papers)
Distilling Two-Timed Flow Models by Separately Matching Initial and Terminal Velocities
Pramook Khungurn, Pratch Piyawongwisal, Sira Sriswadi, Supasorn Suwajanakorn · 02 May 2025

SoccerDiffusion: Toward Learning End-to-End Humanoid Robot Soccer from Gameplay Recordings
Florian Vahl, Jörn Griepenburg, Jan Gutsche, Jasper Güldenstein, Jianwei Zhang · VGen · 29 Apr 2025

Integration Flow Models
Jingjing Wang, Dan Zhang, Joshua Luo, Yin Yang, Feng Luo · 28 Apr 2025

Fast Autoregressive Models for Continuous Latent Generation
Tiankai Hang, Jianmin Bao, Fangyun Wei, Dong Chen · DiffM · 24 Apr 2025