ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
Fang Chen, Gourav Datta, Mujahid Al Rafi, Hyeran Jeon, Meng Tang
arXiv:2406.03744, 6 June 2024
Papers citing "ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs" (8 papers shown)

1. One-step Diffusion with Distribution Matching Distillation [DiffM]
   Tianwei Yin, Michael Gharbi, Richard Zhang, Eli Shechtman, Frédo Durand, William T. Freeman, Taesung Park (30 Nov 2023)

2. Adversarial Diffusion Distillation
   Axel Sauer, Dominik Lorenz, A. Blattmann, Robin Rombach (28 Nov 2023)

3. AutoDiffusion: Training-Free Optimization of Time Steps and Architectures for Automated Diffusion Model Acceleration
   Lijiang Li, Huixia Li, Xiawu Zheng, Jie Wu, Xuefeng Xiao, Rui Wang, Min Zheng, Xin Pan, Fei Chao, R. Ji (19 Sep 2023)

4. Self-Attentive Pooling for Efficient Deep Learning
   Fang Chen, Gourav Datta, Souvik Kundu, P. Beerel (16 Sep 2022)

5. Distilling Knowledge via Knowledge Review
   Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia (19 Apr 2021)

6. Aggregated Residual Transformations for Deep Neural Networks
   Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He (16 Nov 2016)

7. U-Net: Convolutional Networks for Biomedical Image Segmentation [SSeg, 3DV]
   Olaf Ronneberger, Philipp Fischer, Thomas Brox (18 May 2015)

8. ImageNet Large Scale Visual Recognition Challenge [VLM, ObjD]
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei (01 Sep 2014)