ResKD: Residual-Guided Knowledge Distillation
arXiv: 2006.04719 (8 June 2020)
Xuewei Li, Songyuan Li, Bourahla Omar, Fei Wu, Xi Li
Papers citing "ResKD: Residual-Guided Knowledge Distillation" (8 of 8 papers shown)
| Title | Authors | Tags | Stats | Date |
|---|---|---|---|---|
| Cross-Modal and Uncertainty-Aware Agglomeration for Open-Vocabulary 3D Scene Understanding | Jinlong Li, Cristiano Saltori, Fabio Poiesi, N. Sebe |  | 69 / 0 / 0 | 20 Mar 2025 |
| Development of Skip Connection in Deep Neural Networks for Computer Vision and Medical Image Analysis: A Survey | Guoping Xu, Xiaxia Wang, Xinglong Wu, Xuesong Leng, Yongchao Xu | 3DPC | 27 / 8 / 0 | 02 May 2024 |
| CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective | Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu | VLM | 46 / 1 / 0 | 22 Apr 2024 |
| ConaCLIP: Exploring Distillation of Fully-Connected Knowledge Interaction Graph for Lightweight Text-Image Retrieval | Jiapeng Wang, Chengyu Wang, Xiaodan Wang, Jun Huang, Lianwen Jin | VLM | 26 / 4 / 0 | 28 May 2023 |
| Knowledge Distillation from Single to Multi Labels: an Empirical Study | Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu | VLM | 41 / 2 / 0 | 15 Mar 2023 |
| Hilbert Distillation for Cross-Dimensionality Networks | Dian Qin, Haishuai Wang, Zhe Liu, Hongjia Xu, Sheng Zhou, Jiajun Bu |  | 19 / 4 / 0 | 08 Nov 2022 |
| Fixing the train-test resolution discrepancy: FixEfficientNet | Hugo Touvron, Andrea Vedaldi, Matthijs Douze, Hervé Jégou | AAML | 176 / 110 / 0 | 18 Mar 2020 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong |  | 187 / 472 / 0 | 12 Jun 2018 |