Adding Additional Control to One-Step Diffusion with Joint Distribution Matching

13 March 2025
Yihong Luo
Tianyang Hu
Yifan Song
Jiacheng Sun
Zhenguo Li
Jing Tang
Abstract

While diffusion distillation has enabled one-step generation through methods like Variational Score Distillation, adapting distilled models to emerging new controls -- such as novel structural constraints or new user preferences -- remains challenging. Conventional approaches typically require modifying the base diffusion model and redistilling it, a process that is both computationally intensive and time-consuming. To address these challenges, we introduce Joint Distribution Matching (JDM), a novel approach that minimizes the reverse KL divergence between image-condition joint distributions. By deriving a tractable upper bound, JDM decouples fidelity learning from condition learning. This asymmetric distillation scheme enables our one-step student to handle controls unknown to the teacher model, and it facilitates improved classifier-free guidance (CFG) usage and seamless integration of human feedback learning (HFL). Experimental results demonstrate that JDM, using merely one step, surpasses multi-step baselines such as ControlNet in most cases, while achieving state-of-the-art performance in one-step text-to-image synthesis through improved CFG usage or HFL integration.
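The claimed decoupling of fidelity learning from condition learning is consistent with the chain rule of KL divergence. As a rough sketch in generic notation (the symbols $q_\theta$, $p$, $x$, $c$ are our illustration, not necessarily the paper's; the paper works with a tractable upper bound rather than this exact identity):

```latex
% Reverse KL between image-condition joints, factored as q(x) q(c|x):
D_{\mathrm{KL}}\!\left(q_\theta(x, c) \,\|\, p(x, c)\right)
= \underbrace{D_{\mathrm{KL}}\!\left(q_\theta(x) \,\|\, p(x)\right)}_{\text{image fidelity}}
+ \underbrace{\mathbb{E}_{x \sim q_\theta(x)}\!\left[
    D_{\mathrm{KL}}\!\left(q_\theta(c \mid x) \,\|\, p(c \mid x)\right)
  \right]}_{\text{condition consistency}}
```

Under such a split, the first term can be matched against the (condition-agnostic) teacher, while the second depends only on the conditioning signal, which suggests how a one-step student could learn controls the teacher never saw.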

@article{luo2025_2503.06652,
  title={Adding Additional Control to One-Step Diffusion with Joint Distribution Matching},
  author={Yihong Luo and Tianyang Hu and Yifan Song and Jiacheng Sun and Zhenguo Li and Jing Tang},
  journal={arXiv preprint arXiv:2503.06652},
  year={2025}
}