Similarity-Preserving Knowledge Distillation
Frederick Tung, Greg Mori
arXiv:1907.09682 · 23 July 2019

Papers citing "Similarity-Preserving Knowledge Distillation"

Showing 50 of 122 citing papers.
• NOSMOG: Learning Noise-robust and Structure-aware MLPs on Graphs
  Yijun Tian, Chuxu Zhang, Zhichun Guo, Xiangliang Zhang, Nitesh V. Chawla · 22 Aug 2022
• Enhancing Heterogeneous Federated Learning with Knowledge Extraction and Multi-Model Fusion
  Duy Phuong Nguyen, Sixing Yu, J. P. Muñoz, Ali Jannesari · 16 Aug 2022 · FedML
• Task-Balanced Distillation for Object Detection
  Ruining Tang, Zhen-yu Liu, Yangguang Li, Yiguo Song, Hui Liu, Qide Wang, Jing Shao, Guifang Duan, Jianrong Tan · 05 Aug 2022
• Overlooked Poses Actually Make Sense: Distilling Privileged Knowledge for Human Motion Prediction
  Xiaoning Sun, Qiongjie Cui, Huaijiang Sun, Bin Li, Weiqing Li, Jianfeng Lu · 02 Aug 2022
• Locality Guidance for Improving Vision Transformers on Tiny Datasets
  Kehan Li, Runyi Yu, Zhennan Wang, Li-ming Yuan, Guoli Song, Jie Chen · 20 Jul 2022 · ViT
• Deep Semantic Statistics Matching (D2SM) Denoising Network
  Kangfu Mei, Vishal M. Patel, Rui Huang · 19 Jul 2022 · DiffM
• Bridging the Gap between Object and Image-level Representations for Open-Vocabulary Detection
  H. Rasheed, Muhammad Maaz, Muhammad Uzair Khattak, Salman Khan, F. Khan · 07 Jul 2022 · ObjD, VLM
• Boosting Single-Frame 3D Object Detection by Simulating Multi-Frame Point Clouds
  Wu Zheng, Li Jiang, Fanbin Lu, Yangyang Ye, Chi-Wing Fu · 03 Jul 2022 · 3DPC, ObjD
• Boosting 3D Object Detection by Simulating Multimodality on Point Clouds
  Wu Zheng, Ming-Hong Hong, Li Jiang, Chi-Wing Fu · 30 Jun 2022 · 3DPC
• Evaluation-oriented Knowledge Distillation for Deep Face Recognition
  Y. Huang, Jiaxiang Wu, Xingkun Xu, Shouhong Ding · 06 Jun 2022 · CVBM
• Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation
  Yuenan Hou, Xinge Zhu, Yuexin Ma, Chen Change Loy, Yikang Li · 05 Jun 2022 · 3DPC
• Parameter-Efficient and Student-Friendly Knowledge Distillation
  Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao · 28 May 2022
• PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
  Linfeng Zhang, Runpei Dong, Hung-Shuo Tai, Kaisheng Ma · 23 May 2022 · 3DPC
• Knowledge Distillation via the Target-aware Transformer
  Sihao Lin, Hongwei Xie, Bing Wang, Kaicheng Yu, Xiaojun Chang, Xiaodan Liang, G. Wang · 22 May 2022 · ViT
• Knowledge Distillation Meets Open-Set Semi-Supervised Learning
  Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos · 13 May 2022
• Generalized Knowledge Distillation via Relationship Matching
  Han-Jia Ye, Su Lu, De-Chuan Zhan · 04 May 2022 · FedML
• Cross-Image Relational Knowledge Distillation for Semantic Segmentation
  Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang · 14 Apr 2022
• CoupleFace: Relation Matters for Face Recognition Distillation
  Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu · 12 Apr 2022 · CVBM
• ShowFace: Coordinated Face Inpainting with Memory-Disentangled Refinement Networks
  Zhuo Wu, Xingqun Qi, Zijian Wang, Wanting Zhou, Kun Yuan, Muyi Sun, Zhenan Sun · 06 Apr 2022 · CVBM
• Knowledge Distillation with the Reused Teacher Classifier
  Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen · 26 Mar 2022
• A Cross-Domain Approach for Continuous Impression Recognition from Dyadic Audio-Visual-Physio Signals
  Yuanchao Li, Catherine Lai · 25 Mar 2022
• Channel Self-Supervision for Online Knowledge Distillation
  Shixi Fan, Xuan Cheng, Xiaomin Wang, Chun Yang, Pan Deng, Minghui Liu, Jiali Deng, Meilin Liu · 22 Mar 2022
• SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
  Yongwei Wang, Yuheng Wang, Tim K. Lee, C. Miao, Z. J. Wang · 22 Mar 2022
• Unsupervised Lifelong Person Re-identification via Contrastive Rehearsal
  Hao Chen, Benoit Lagadec, F. Brémond · 12 Mar 2022 · CLL
• Wavelet Knowledge Distillation: Towards Efficient Image-to-Image Translation
  Linfeng Zhang, Xin Chen, Xiaobing Tu, Pengfei Wan, N. Xu, Kaisheng Ma · 12 Mar 2022
• Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability
  Ruifei He, Shuyang Sun, Jihan Yang, Song Bai, Xiaojuan Qi · 10 Mar 2022
• Learn From the Past: Experience Ensemble Knowledge Distillation
  Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang · 25 Feb 2022
• Anomaly Detection via Reverse Distillation from One-Class Embedding
  Hanqiu Deng, Xingyu Li · 26 Jan 2022 · UQCV
• The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image
  Yuki M. Asano, Aaqib Saeed · 01 Dec 2021
• Arch-Net: Model Distillation for Architecture Agnostic Model Deployment
  Weixin Xu, Zipeng Feng, Shuangkang Fang, Song Yuan, Yi Yang, Shuchang Zhou · 01 Nov 2021 · MQ
• Object DGCNN: 3D Object Detection using Dynamic Graphs
  Yue Wang, Justin Solomon · 13 Oct 2021 · 3DPC
• Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition
  Jianyuan Ni, Raunak Sarbajna, Yang Liu, A. Ngu, Yan Yan · 08 Oct 2021 · HAI
• Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better
  Xuanyang Zhang, X. Zhang, Jian-jun Sun · 26 Sep 2021
• Weakly-Supervised Monocular Depth Estimation with Resolution-Mismatched Data
  Jialei Xu, Yuanchao Bai, Xianming Liu, Junjun Jiang, Xiangyang Ji · 23 Sep 2021 · MDE
• Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
  Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 07 Sep 2021
• Adversarial Robustness for Unsupervised Domain Adaptation
  Muhammad Awais, Fengwei Zhou, Hang Xu, Lanqing Hong, Ping Luo, Sung-Ho Bae, Zhenguo Li · 02 Sep 2021
• Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision
  Bo-wen Li, Xinyang Jiang, Donglin Bai, Yuge Zhang, Ningxin Zheng, Xuanyi Dong, Lu Liu, Yuqing Yang, Dongsheng Li · 30 Aug 2021
• CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation
  Wenxuan Zou, Muyi Sun · 27 Aug 2021
• Unsupervised Domain-adaptive Hash for Networks
  Tao He, Lianli Gao, Jingkuan Song, Yuan-Fang Li · 20 Aug 2021
• Hierarchical Self-supervised Augmented Knowledge Distillation
  Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 29 Jul 2021 · SSL
• Double Similarity Distillation for Semantic Image Segmentation
  Yingchao Feng, Xian Sun, Wenhui Diao, Jihao Li, Xin Gao · 19 Jul 2021
• Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification
  Xiaohan Xing, Yuenan Hou, Han Li, Yixuan Yuan, Hongsheng Li, M. Meng · 07 Jul 2021 · VLM
• DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval
  Giorgos Kordopatis-Zilos, Christos Tzelepis, Symeon Papadopoulos, I. Kompatsiaris, Ioannis Patras · 24 Jun 2021
• Co-advise: Cross Inductive Bias Distillation
  Sucheng Ren, Zhengqi Gao, Tianyu Hua, Zihui Xue, Yonglong Tian, Shengfeng He, Hang Zhao · 23 Jun 2021
• Distilling EEG Representations via Capsules for Affective Computing
  Guangyi Zhang, Ali Etemad · 30 Apr 2021
• Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation
  Zeqi Li, R. Jiang, P. Aarabi · 30 Apr 2021 · GAN, VLM
• Piggyback GAN: Efficient Lifelong Learning for Image Conditioned Generation
  Mengyao Zhai, Lei Chen, Jiawei He, Megha Nawhal, Frederick Tung, Greg Mori · 24 Apr 2021 · CLL
• Visualizing Adapted Knowledge in Domain Transfer
  Yunzhong Hou, Liang Zheng · 20 Apr 2021
• Distill on the Go: Online knowledge distillation in self-supervised learning
  Prashant Bhat, Elahe Arani, Bahram Zonooz · 20 Apr 2021 · SSL
• Distilling Knowledge via Knowledge Review
  Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia · 19 Apr 2021