ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Subclass Distillation


10 February 2020
Rafael Müller, Simon Kornblith, Geoffrey E. Hinton

Papers citing "Subclass Distillation"

5 of 5 citing papers shown:
Linear Projections of Teacher Embeddings for Few-Class Distillation
Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee
30 Sep 2024
Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Sanjiv Kumar
28 Jan 2023
No Subclass Left Behind: Fine-Grained Robustness in Coarse-Grained Classification Problems
N. Sohoni, Jared A. Dunnmon, Geoffrey Angus, Albert Gu, Christopher Ré
25 Nov 2020
Anti-Distillation: Improving reproducibility of deep networks
G. Shamir, Lorenzo Coviello
19 Oct 2020
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML
09 Apr 2018