Improving Knowledge Distillation with Teacher's Explanation
arXiv:2310.02572 · 4 October 2023
S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese
Tags: FAtt
Papers citing "Improving Knowledge Distillation with Teacher's Explanation" (3 papers)

1. Contrastive Representation Distillation
   Yonglong Tian, Dilip Krishnan, Phillip Isola
   International Conference on Learning Representations (ICLR), 2019
   1,202 citations · 23 Oct 2019

2. A Unified Approach to Interpreting Model Predictions
   Scott M. Lundberg, Su-In Lee
   Tags: FAtt
   28,723 citations · 22 May 2017

3. "Why Should I Trust You?": Explaining the Predictions of Any Classifier
   Marco Tulio Ribeiro, Sameer Singh, Carlos Guestrin
   Tags: FAtt, FaML
   19,420 citations · 16 Feb 2016