Subclass Distillation (arXiv:2002.03936)
Rafael Müller, Simon Kornblith, Geoffrey E. Hinton
10 February 2020
Papers citing "Subclass Distillation" (5 of 5 papers shown)

Linear Projections of Teacher Embeddings for Few-Class Distillation
Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee
30 Sep 2024

Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Surinder Kumar
28 Jan 2023

No Subclass Left Behind: Fine-Grained Robustness in Coarse-Grained Classification Problems
N. Sohoni, Jared A. Dunnmon, Geoffrey Angus, Albert Gu, Christopher Ré
25 Nov 2020

Anti-Distillation: Improving reproducibility of deep networks
G. Shamir, Lorenzo Coviello
19 Oct 2020

Large scale distributed neural network training through online distillation (FedML)
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
09 Apr 2018