Feature-map-level Online Adversarial Knowledge Distillation
5 February 2020
Inseop Chung, Seonguk Park, Jangho Kim, Nojun Kwak
Topics: GAN
arXiv: 2002.01775 · PDF · HTML
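For context on the listed paper: it trains two peer networks online and distills between them at both the logit level and the feature-map level. Each network carries a discriminator trained to tell its own feature maps from its peer's, while each peer learns to fool the other network's discriminator, so the two feature-map distributions are pulled together adversarially. The PyTorch sketch below illustrates one way such a training step could look; the module definitions, the least-squares adversarial objective, and the loss weights are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of feature-map-level online adversarial KD between two peer
# networks (after arXiv:2002.01775). Names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PeerNet(nn.Module):
    """Tiny CNN that exposes its last feature map alongside the logits."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(64 * 4 * 4, num_classes)

    def forward(self, x):
        fmap = self.features(x)            # feature map used for distillation
        logits = self.classifier(fmap.flatten(1))
        return logits, fmap

class Discriminator(nn.Module):
    """Scores whether a feature map comes from this discriminator's own network."""
    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )

    def forward(self, fmap):
        return self.net(fmap)

def train_step(x, y, net1, net2, d1, d2, opt_nets, opt_discs, T=3.0):
    logits1, f1 = net1(x)
    logits2, f2 = net2(x)

    # Discriminators: d_k learns to output 1 for its own network's maps and 0
    # for the peer's (least-squares GAN objective; an assumption for stability).
    d_loss = (
        (d1(f1.detach()) - 1).pow(2).mean() + d1(f2.detach()).pow(2).mean()
        + (d2(f2.detach()) - 1).pow(2).mean() + d2(f1.detach()).pow(2).mean()
    )
    opt_discs.zero_grad(); d_loss.backward(); opt_discs.step()

    # Networks: task loss + mutual logit-level KD + fooling the peer's
    # discriminator so each network's feature maps mimic the other's.
    ce = F.cross_entropy(logits1, y) + F.cross_entropy(logits2, y)
    kl = (
        F.kl_div(F.log_softmax(logits1 / T, -1),
                 F.softmax(logits2.detach() / T, -1), reduction="batchmean")
        + F.kl_div(F.log_softmax(logits2 / T, -1),
                   F.softmax(logits1.detach() / T, -1), reduction="batchmean")
    ) * T * T
    adv = (d2(f1) - 1).pow(2).mean() + (d1(f2) - 1).pow(2).mean()
    g_loss = ce + kl + 0.1 * adv               # 0.1 is an assumed weight
    opt_nets.zero_grad(); g_loss.backward(); opt_nets.step()
    return d_loss.item(), g_loss.item()

# Usage (illustrative):
# net1, net2 = PeerNet(), PeerNet()
# d1, d2 = Discriminator(), Discriminator()
# opt_nets = torch.optim.SGD([*net1.parameters(), *net2.parameters()], lr=0.1, momentum=0.9)
# opt_discs = torch.optim.Adam([*d1.parameters(), *d2.parameters()], lr=1e-4)
```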
Papers citing "Feature-map-level Online Adversarial Knowledge Distillation" (18 of 18 papers shown)
| Title | Authors | Topics | Metrics | Date |
|---|---|---|---|---|
| Teacher-Student Architecture for Knowledge Distillation: A Survey | Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu | | 21 / 16 / 0 | 08 Aug 2023 |
| Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data | Qing Xu, Min-man Wu, Xiaoli Li, K. Mao, Zhenghua Chen | | 14 / 5 / 0 | 07 Jul 2023 |
| Performance-aware Approximation of Global Channel Pruning for Multitask CNNs | Hancheng Ye, Bo-Wen Zhang, Tao Chen, Jiayuan Fan, Bin Wang | | 24 / 18 / 0 | 21 Mar 2023 |
| BD-KD: Balancing the Divergences for Online Knowledge Distillation | Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark | | 24 / 2 / 0 | 25 Dec 2022 |
| Lightning Fast Video Anomaly Detection via Adversarial Knowledge Distillation | Florinel-Alin Croitoru, Nicolae-Cătălin Ristea, D. Dascalescu, Radu Tudor Ionescu, F. Khan, M. Shah | | 34 / 2 / 0 | 28 Nov 2022 |
| SADT: Combining Sharpness-Aware Minimization with Self-Distillation for Improved Model Generalization | Masud An Nur Islam Fahim, Jani Boutellier | | 32 / 0 / 0 | 01 Nov 2022 |
| Teacher-Student Architecture for Knowledge Learning: A Survey | Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu | | 20 / 35 / 0 | 28 Oct 2022 |
| Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification | Linhao Qu, Xiao-Zhuo Luo, Manning Wang, Zhijian Song | WSOD | 26 / 57 / 0 | 07 Oct 2022 |
| Multi-domain Learning for Updating Face Anti-spoofing Models | Xiao Guo, Yaojie Liu, Anil Jain, Xiaoming Liu | CLL, CVBM | 16 / 30 / 0 | 23 Aug 2022 |
| Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition | Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang | | 31 / 50 / 0 | 23 Jul 2022 |
| Improved Knowledge Distillation via Adversarial Collaboration | Zhiqiang Liu, Chengkai Huang, Yanxia Liu | | 18 / 2 / 0 | 29 Nov 2021 |
| Local-Selective Feature Distillation for Single Image Super-Resolution | Seonguk Park, Nojun Kwak | | 10 / 9 / 0 | 22 Nov 2021 |
| Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution | Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu | | 22 / 15 / 0 | 07 Sep 2021 |
| Distilling a Powerful Student Model via Online Knowledge Distillation | Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji | FedML | 25 / 46 / 0 | 26 Mar 2021 |
| Bringing AI To Edge: From Deep Learning's Perspective | Di Liu, Hao Kong, Xiangzhong Luo, Weichen Liu, Ravi Subramaniam | | 42 / 116 / 0 | 25 Nov 2020 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 19 / 2,835 / 0 | 09 Jun 2020 |
| Feature Fusion for Online Mutual Knowledge Distillation | Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak | FedML | 16 / 91 / 0 | 19 Apr 2019 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | | 190 / 473 / 0 | 12 Jun 2018 |