Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation
arXiv:2005.07839 · 16 May 2020
Le Thanh Nguyen-Meidine, Eric Granger, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin
Papers citing "Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation" (9 papers):
Teacher-Student Architecture for Knowledge Distillation: A Survey. Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu. 08 Aug 2023.
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data. Qing Xu, Min-man Wu, Xiaoli Li, K. Mao, Zhenghua Chen. 07 Jul 2023.
WIDER & CLOSER: Mixture of Short-channel Distillers for Zero-shot Cross-lingual Named Entity Recognition. Jun-Yu Ma, Beiduo Chen, Jia-Chen Gu, Zhen-Hua Ling, Wu Guo, Quan Liu, Zhigang Chen, Cong Liu. 07 Dec 2022.
Memory Consistent Unsupervised Off-the-Shelf Model Adaptation for Source-Relaxed Medical Image Segmentation. Xiaofeng Liu, Fangxu Xing, G. El Fakhri, Jonghye Woo. 16 Sep 2022.
Factorizing Knowledge in Neural Networks. Xingyi Yang, Jingwen Ye, Xinchao Wang. 04 Jul 2022.
Towards Accurate Cross-Domain In-Bed Human Pose Estimation. Mohamed Afham, Udith Haputhanthri, Jathurshan Pradeepkumar, Mithunjha Anandakumar, Ashwin De Silva, Chamira U. S. Edussooriya. 07 Oct 2021.
FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning. Minhan Kim, Shahroz Tariq, Simon S. Woo. 28 May 2021.
Unsupervised Domain Adaptation in the Dissimilarity Space for Person Re-identification. Djebril Mekhazni, Amran Bhuiyan, G. Ekladious, Eric Granger. 27 Jul 2020.
Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation. Le Thanh Nguyen-Meidine, Atif Bela, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin, Eric Granger. 14 Jul 2020.