One-Step Residual Shifting Diffusion for Image Super-Resolution via Distillation

17 March 2025
Daniil Selikhanovych
David Li
Aleksei Leonov
Nikita Gushchin
Sergei Kushneriuk
Alexander N. Filippov
Evgeny Burnaev
Iaroslav Koshelev
Alexander Korotin
Abstract

Diffusion models for super-resolution (SR) produce high-quality visual results but incur high computational costs. Although several methods have been developed to accelerate diffusion-based SR models, some (e.g., SinSR) fail to produce realistic perceptual details, while others (e.g., OSEDiff) may hallucinate non-existent structures. To overcome these issues, we present RSD, a new distillation method for ResShift, one of the top diffusion-based SR models. Our method trains the student network to produce images such that a new fake ResShift model trained on them coincides with the teacher model. RSD achieves single-step restoration and outperforms the teacher by a large margin. We show that our distillation method surpasses SinSR, the other distillation-based method for ResShift, and is on par with state-of-the-art diffusion-based SR distillation methods. Compared to SR methods based on pre-trained text-to-image models, RSD produces competitive perceptual quality, provides images with better alignment to the degraded input images, and requires fewer parameters and less GPU memory. We provide experimental results on various real-world and synthetic datasets, including RealSR, RealSet65, DRealSR, ImageNet, and DIV2K.
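The training scheme described above (a one-step student, a "fake" ResShift-style model refit on the student's outputs, and a frozen teacher the fake model is pushed to coincide with) can be outlined roughly as follows. This is a minimal conceptual sketch under stated assumptions, not the authors' implementation: the tiny network, the regression losses, and all names (TinySRNet, train_step) are hypothetical placeholders, and the actual method operates inside ResShift's diffusion framework, which is omitted here.

```python
# Conceptual sketch of alternating student / fake-model distillation updates.
# All names and losses are illustrative placeholders, not the RSD code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRNet(nn.Module):
    """Stand-in for a ResShift-style SR network; real models are far larger."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

teacher = TinySRNet().eval()   # frozen pre-trained teacher (placeholder)
for p in teacher.parameters():
    p.requires_grad_(False)
student = TinySRNet()          # one-step generator being distilled
fake = TinySRNet()             # "fake" model trained on the student's outputs

opt_student = torch.optim.Adam(student.parameters(), lr=1e-4)
opt_fake = torch.optim.Adam(fake.parameters(), lr=1e-4)

def train_step(lr_batch: torch.Tensor) -> float:
    # 1) Student produces a single-step SR estimate from the degraded input.
    sr = student(lr_batch)

    # 2) Refit the fake model on the current student outputs
    #    (denoising-style regression in the paper; simplified to MSE here).
    opt_fake.zero_grad()
    fake_loss = F.mse_loss(fake(sr.detach()), sr.detach())
    fake_loss.backward()
    opt_fake.step()

    # 3) Update the student so that the fake model's predictions on its
    #    outputs coincide with the frozen teacher's (the distillation signal).
    with torch.no_grad():
        target = teacher(sr)
    opt_student.zero_grad()
    student_loss = F.mse_loss(fake(sr), target)
    student_loss.backward()          # gradients flow to the student via sr
    opt_student.step()               # only the student is updated here
    return student_loss.item()

if __name__ == "__main__":
    lr_images = torch.rand(2, 3, 64, 64)  # dummy low-resolution batch
    print(train_step(lr_images))
```

The key design point, as stated in the abstract, is that the student never regresses directly onto teacher samples; instead, a separate fake model trained on the student's outputs is driven toward the teacher, which is what allows the one-step student to outperform the multi-step teacher.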

View on arXiv
@article{selikhanovych2025_2503.13358,
  title={One-Step Residual Shifting Diffusion for Image Super-Resolution via Distillation},
  author={Daniil Selikhanovych and David Li and Aleksei Leonov and Nikita Gushchin and Sergei Kushneriuk and Alexander Filippov and Evgeny Burnaev and Iaroslav Koshelev and Alexander Korotin},
  journal={arXiv preprint arXiv:2503.13358},
  year={2025}
}