On scalable and efficient training of diffusion samplers

26 May 2025
Minkyu Kim, Kiyoung Seong, Dongyeop Woo, SungSoo Ahn, Minsu Kim
DiffM

Papers citing "On scalable and efficient training of diffusion samplers"

4 of 4 papers shown
From discrete-time policies to continuous-time diffusion samplers: Asymptotic equivalences and faster training
Julius Berner, Lorenz Richter, Marcin Sendera, Jarrid Rector-Brooks, Nikolay Malkin
OffRL
10 Jan 2025

NETS: A Non-Equilibrium Transport Sampler
M. S. Albergo, Eric Vanden-Eijnden
DiffM
03 Oct 2024

Adaptive teachers for amortized samplers
Minsu Kim, Sanghyeok Choi, Taeyoung Yun, Emmanuel Bengio, Leo Feng, Jarrid Rector-Brooks, Sungsoo Ahn, Jinkyoo Park, Nikolay Malkin, Yoshua Bengio
02 Oct 2024

Improved off-policy training of diffusion samplers
Marcin Sendera, Minsu Kim, Sarthak Mittal, Pablo Lemos, Luca Scimeca, Jarrid Rector-Brooks, Alexandre Adam, Yoshua Bengio, Nikolay Malkin
OffRL
07 Feb 2024