Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation
13 November 2019
Junjie Liu, Dongchao Wen, Hongxing Gao, Wei Tao, Tse-Wei Chen, Kinya Osa, Masami Kato
arXiv: 1911.05329
Papers citing "Knowledge Representing: Efficient, Sparse Representation of Prior Knowledge for Knowledge Distillation" (4 papers):

1. Choosing Wisely and Learning Deeply: Selective Cross-Modality Distillation via CLIP for Domain Generalization
   Jixuan Leng, Yijiang Li, Haohan Wang [VLM]
   26 Nov 2023

2. Dataset Distillation: A Comprehensive Review
   Ruonan Yu, Songhua Liu, Xinchao Wang [DD]
   17 Jan 2023

3. Dynamic Group Convolution for Accelerating Convolutional Neural Networks
   Z. Su, Linpu Fang, Wenxiong Kang, D. Hu, M. Pietikäinen, Li Liu
   08 Jul 2020

4. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao [VLM]
   09 Jun 2020