Scale-wise Distillation of Diffusion Models

20 March 2025
Nikita Starodubcev
Denis Kuznedelev
Artem Babenko
Dmitry Baranchuk
    DiffM
Abstract

We present SwD, a scale-wise distillation framework for diffusion models (DMs) that effectively applies next-scale prediction ideas to diffusion-based few-step generators. Specifically, SwD is inspired by recent insights relating diffusion processes to implicit spectral autoregression. We hypothesize that DMs can initiate generation at lower data resolutions and gradually upscale the samples at each denoising step without loss in performance, while significantly reducing computational costs. SwD naturally integrates this idea into existing diffusion distillation methods based on distribution matching. We also enrich the family of distribution matching approaches with a novel patch loss that enforces finer-grained similarity to the target distribution. When applied to state-of-the-art text-to-image diffusion models, SwD approaches the inference time of two full-resolution steps and significantly outperforms its counterparts under the same computation budget, as evidenced by automated metrics and human preference studies.
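The abstract describes two ingredients: a sampler that starts denoising at a reduced resolution and upscales between steps, and a patch-level loss for distribution matching. The sketch below is a minimal illustration of both ideas in plain PyTorch, not the authors' implementation: the denoiser call signature, the scale and sigma schedules, bilinear upscaling, and the per-patch moment matching are all hypothetical placeholders chosen for clarity.

# Minimal sketch (assumptions noted above) of scale-wise few-step sampling:
# denoise at a coarse resolution first, upscale between steps, and only run
# the final steps at full resolution.
import torch
import torch.nn.functional as F

@torch.no_grad()
def scale_wise_sample(denoiser, batch=1, channels=4,
                      scales=(32, 64, 96, 128), sigmas=(1.0, 0.6, 0.3, 0.0)):
    """Few-step sampler that grows spatial resolution across denoising steps."""
    # Start from pure noise at the coarsest scale.
    x = torch.randn(batch, channels, scales[0], scales[0])
    for i, (res, sigma) in enumerate(zip(scales, sigmas)):
        # One denoising step at the current (possibly reduced) resolution.
        # `denoiser(x, sigma)` is a placeholder interface.
        x = denoiser(x, torch.full((batch,), sigma))
        # Upscale before the next step, so early steps stay cheap.
        if i + 1 < len(scales):
            x = F.interpolate(x, size=scales[i + 1], mode="bilinear",
                              align_corners=False)
    return x  # sample at the final scale

def patch_distribution_loss(student_feats, teacher_feats, patch=8):
    """Illustrative patch-level loss: match statistics of local patches rather
    than only global features, as a stand-in for the finer-grained patch loss
    mentioned in the abstract."""
    # Split (B, C, H, W) feature maps into non-overlapping patches.
    s = F.unfold(student_feats, kernel_size=patch, stride=patch)  # (B, C*p*p, N)
    t = F.unfold(teacher_feats, kernel_size=patch, stride=patch)
    # Penalize mismatched per-patch mean and variance.
    loss_mean = (s.mean(dim=1) - t.mean(dim=1)).pow(2).mean()
    loss_var = (s.var(dim=1) - t.var(dim=1)).pow(2).mean()
    return loss_mean + loss_var

For instance, a toy denoiser such as lambda x, s: 0.9 * x runs through scale_wise_sample unchanged; in the distillation setting described by the paper, the student generator would take that role and the patch-style loss would supplement the global distribution matching objective.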

@article{starodubcev2025_2503.16397,
  title={Scale-wise Distillation of Diffusion Models},
  author={Nikita Starodubcev and Denis Kuznedelev and Artem Babenko and Dmitry Baranchuk},
  journal={arXiv preprint arXiv:2503.16397},
  year={2025}
}