
Entropy-based Training Methods for Scalable Neural Implicit Sampler
Weijian Luo, Boya Zhang, Zhihua Zhang
8 June 2023 · arXiv:2306.04952

Papers citing "Entropy-based Training Methods for Scalable Neural Implicit Sampler"

5 papers shown:
Schedule On the Fly: Diffusion Time Prediction for Faster and Better Image Generation
Zilyu Ye, Zhiyang Chen, Tiancheng Li, Zemin Huang, Weijian Luo, Guo-jun Qi
DiffM · 02 Dec 2024
Training Neural Samplers with Reverse Diffusive KL Divergence
Jiajun He, Wenlin Chen, Mingtian Zhang, David Barber, José Miguel Hernández-Lobato
DiffM · 16 Oct 2024
Make-An-Audio: Text-To-Audio Generation with Prompt-Enhanced Diffusion Models
Rongjie Huang, Jia-Bin Huang, Dongchao Yang, Yi Ren, Luping Liu, Mingze Li, Zhenhui Ye, Jinglin Liu, Xiaoyue Yin, Zhou Zhao
DiffM · 30 Jan 2023
Autoregressive Score Matching
Chenlin Meng, Lantao Yu, Yang Song, Jiaming Song, Stefano Ermon
DiffM · 24 Oct 2020
MCMC using Hamiltonian dynamics
Radford M. Neal
09 Jun 2012