Preparing Lessons: Improve Knowledge Distillation with Better Supervision
18 November 2019
Tiancheng Wen, Shenqi Lai, Xueming Qian
arXiv: 1911.07471

Papers citing "Preparing Lessons: Improve Knowledge Distillation with Better Supervision" (4 papers)

Using Knowledge Distillation to improve interpretable models in a retail banking context
Maxime Biehler, Mohamed Guermazi, Célim Starck
30 Sep 2022

There is More than Meets the Eye: Self-Supervised Multi-Object Detection and Tracking with Sound by Distilling Multimodal Knowledge
Francisco Rivera Valverde, Juana Valeria Hurtado, Abhinav Valada
01 Mar 2021

Robustness and Diversity Seeking Data-Free Knowledge Distillation
Pengchao Han, Jihong Park, Shiqiang Wang, Yejun Liu
07 Nov 2020

Robust Student Network Learning
Tianyu Guo, Chang Xu, Shiyi He, Boxin Shi, Chao Xu, Dacheng Tao
30 Jul 2018