arXiv:1907.09682
Similarity-Preserving Knowledge Distillation
23 July 2019
Frederick Tung, Greg Mori
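For context on the method the list below cites: the paper transfers knowledge by matching pairwise activation similarities within a batch. Teacher and student activation maps are flattened per sample, turned into b×b Gram matrices, row-normalized, and compared with a mean squared Frobenius-norm penalty. The following is a minimal PyTorch sketch of that loss under those definitions; the function name `sp_loss` and the tensor shapes are our assumptions for illustration, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def sp_loss(f_t: torch.Tensor, f_s: torch.Tensor) -> torch.Tensor:
    """Similarity-preserving KD loss (Tung & Mori, 2019).

    f_t, f_s: teacher/student activation maps of shape (b, c, h, w).
    Channel and spatial sizes may differ between the two networks;
    only the batch size b must match.
    """
    b = f_t.size(0)
    # Flatten each sample and form b x b batch-similarity (Gram) matrices.
    g_t = torch.mm(f_t.view(b, -1), f_t.view(b, -1).t())
    g_s = torch.mm(f_s.view(b, -1), f_s.view(b, -1).t())
    # Row-wise L2 normalization of each similarity matrix.
    g_t = F.normalize(g_t, p=2, dim=1)
    g_s = F.normalize(g_s, p=2, dim=1)
    # Mean squared Frobenius-norm difference, averaged over b^2 entries.
    return ((g_t - g_s) ** 2).sum() / (b * b)
```

In the paper this term is summed over selected teacher/student layer pairs and added to the student's ordinary cross-entropy loss with a weighting coefficient γ.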
Papers citing "Similarity-Preserving Knowledge Distillation" (50 of 122 papers shown)
Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng (27 Apr 2025)

Moss: Proxy Model-based Full-Weight Aggregation in Federated Learning with Heterogeneous Models
Y. Cai, Ziqi Zhang, Ding Li, Yao Guo, Xiangqun Chen (13 Mar 2025)

Semantic-Supervised Spatial-Temporal Fusion for LiDAR-based 3D Object Detection
Chaoqun Wang, Xiaobin Hong, Wenzhong Li, Ruimao Zhang (13 Mar 2025) [3DPC]

FEDS: Feature and Entropy-Based Distillation Strategy for Efficient Learned Image Compression
H. Fu, Jie Liang, Zhenman Fang, Jingning Han (09 Mar 2025)

VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma (28 Feb 2025)

Variational Bayesian Adaptive Learning of Deep Latent Variables for Acoustic Knowledge Transfer
Hu Hu, Sabato Marco Siniscalchi, Chao-Han Huck Yang, Chin-Hui Lee (28 Jan 2025)

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang (13 Jan 2025)

Knowledge Distillation with Adapted Weight
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng (06 Jan 2025)

Budgeted Online Continual Learning by Adaptive Layer Freezing and Frequency-based Sampling
Minhyuk Seo, Hyunseo Koh, Jonghyun Choi (19 Oct 2024)

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal (30 Sep 2024)

Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Chaomin Shen, Faming Fang, Guixu Zhang (27 Sep 2024)

Relational Representation Distillation
Nikolaos Giakoumoglou, Tania Stathaki (16 Jul 2024)

Leveraging Topological Guidance for Improved Knowledge Distillation
Eun Som Jeon, Rahul Khurana, Aishani Pathak, P. Turaga (07 Jul 2024)

Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data
Eun Som Jeon, Hongjun Choi, A. Shukla, Yuan Wang, Hyunglae Lee, M. Buman, P. Turaga (07 Jul 2024)

Instance Temperature Knowledge Distillation
Zhengbo Zhang, Yuxi Zhou, Jia Gong, Jun Liu, Zhigang Tu (27 Jun 2024)

Self-Supervised Representation Learning with Spatial-Temporal Consistency for Sign Language Recognition
Weichao Zhao, Wengang Zhou, Hezhen Hu, Min Wang, Houqiang Li (15 Jun 2024) [SLR]

ReDistill: Residual Encoded Distillation for Peak Memory Reduction of CNNs
Fang Chen, Gourav Datta, Mujahid Al Rafi, Hyeran Jeon, Meng Tang (06 Jun 2024)

AdaQAT: Adaptive Bit-Width Quantization-Aware Training
Cédric Gernigon, Silviu-Ioan Filip, Olivier Sentieys, Clément Coggiola, Mickael Bruno (22 Apr 2024)

CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu (22 Apr 2024) [VLM]

Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei (08 Mar 2024)

Iterative Data Smoothing: Mitigating Reward Overfitting and Overoptimization in RLHF
Banghua Zhu, Michael I. Jordan, Jiantao Jiao (29 Jan 2024)

Revisiting Knowledge Distillation under Distribution Shift
Songming Zhang, Ziyu Lyu, Xiaofeng Chen (25 Dec 2023)

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang (26 Oct 2023)

Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang, Zhi-Long Han, Yanmei Wang, Xi’ai Chen, Baicheng Liu, Yandong Tang (03 Jul 2023)

Cross Architecture Distillation for Face Recognition
Weisong Zhao, Xiangyu Zhu, Zhixiang He, Xiaoyu Zhang, Zhen Lei (26 Jun 2023) [CVBM]

CORSD: Class-Oriented Relational Self Distillation
Muzhou Yu, S. Tan, Kailu Wu, Runpei Dong, Linfeng Zhang, Kaisheng Ma (28 Apr 2023)

Grouped Knowledge Distillation for Deep Face Recognition
Weisong Zhao, Xiangyu Zhu, Kaiwen Guo, Xiaoyu Zhang, Zhen Lei (10 Apr 2023) [CVBM]

Mixed-Type Wafer Classification For Low Memory Devices Using Knowledge Distillation
Nitish Shukla, Anurima Dey, K. Srivatsan (24 Mar 2023)

Graph-based Knowledge Distillation: A survey and experimental evaluation
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao (27 Feb 2023)

Two-in-one Knowledge Distillation for Efficient Facial Forgery Detection
Chu Zhou, Jiajun Huang, Daochang Liu, Chengbin Du, Siqi Ma, Surya Nepal, Chang Xu (21 Feb 2023)

Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Surinder Kumar (28 Jan 2023)

TiG-BEV: Multi-view BEV 3D Object Detection via Target Inner-Geometry Learning
Pei-Kai Huang, L. Liu, Renrui Zhang, Song Zhang, Xin Xu, Bai-Qi Wang, G. Liu (28 Dec 2022) [3DPC, MDE]

OVO: One-shot Vision Transformer Search with Online distillation
Zimian Wei, H. Pan, Xin-Yi Niu, Dongsheng Li (28 Dec 2022) [ViT]

BD-KD: Balancing the Divergences for Online Knowledge Distillation
Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark (25 Dec 2022)

3D Point Cloud Pre-training with Knowledge Distillation from 2D Images
Yuan Yao, Yuanhan Zhang, Zhen-fei Yin, Jiebo Luo, Wanli Ouyang, Xiaoshui Huang (17 Dec 2022) [3DPC]

Accelerating Dataset Distillation via Model Augmentation
Lei Zhang, Jie M. Zhang, Bowen Lei, Subhabrata Mukherjee, Xiang Pan, Bo-Lu Zhao, Caiwen Ding, Y. Li, Dongkuan Xu (12 Dec 2022) [DD]

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang (29 Nov 2022)

D³ETR: Decoder Distillation for Detection Transformer
Xiaokang Chen, Jiahui Chen, Y. Liu, Gang Zeng (17 Nov 2022)

Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection
Linfeng Zhang, Yukang Shi, Hung-Shuo Tai, Zhipeng Zhang, Yuan He, Ke Wang, Kaisheng Ma (14 Nov 2022)

Hilbert Distillation for Cross-Dimensionality Networks
Dian Qin, Haishuai Wang, Zhe Liu, Hongjia Xu, Sheng Zhou, Jiajun Bu (08 Nov 2022)

Multimodal Transformer Distillation for Audio-Visual Synchronization
Xuan-Bo Chen, Haibin Wu, Chung-Che Wang, Hung-yi Lee, J. Jang (27 Oct 2022)

COST-EFF: Collaborative Optimization of Spatial and Temporal Efficiency with Slenderized Multi-exit Language Models
Bowen Shen, Zheng Lin, Yuanxin Liu, Zhengxiao Liu, Lei Wang, Weiping Wang (27 Oct 2022) [VLM]

Improved Feature Distillation via Projector Ensemble
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Zi Huang (27 Oct 2022)

Respecting Transfer Gap in Knowledge Distillation
Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang (23 Oct 2022)

Few-Shot Learning of Compact Models via Task-Specific Meta Distillation
Yong Wu, Shekhor Chanda, M. Hosseinzadeh, Zhi Liu, Yang Wang (18 Oct 2022) [VLM]

Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh (21 Sep 2022)

Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective
Jiangmeng Li, Yanan Zhang, Wenwen Qiang, Lingyu Si, Chengbo Jiao, Xiaohui Hu, Changwen Zheng, Fuchun Sun (26 Aug 2022) [CML]

CMD: Self-supervised 3D Action Representation Learning with Cross-modal Mutual Distillation
Yunyao Mao, Wen-gang Zhou, Zhenbo Lu, Jiajun Deng, Houqiang Li (26 Aug 2022)

Multi-domain Learning for Updating Face Anti-spoofing Models
Xiao Guo, Yaojie Liu, Anil Jain, Xiaoming Liu (23 Aug 2022) [CLL, CVBM]

GCISG: Guided Causal Invariant Learning for Improved Syn-to-real Generalization
Gilhyun Nam, Gyeongjae Choi, Kyungmin Lee (22 Aug 2022) [OOD]