Cited By
Improving Knowledge Distillation with Teacher's Explanation
S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese
arXiv:2310.02572, 4 October 2023
Tags: FAtt
Papers citing "Improving Knowledge Distillation with Teacher's Explanation" (3 papers shown)
Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
International Conference on Learning Representations (ICLR), 2019
23 Oct 2019
A Unified Approach to Interpreting Model Predictions
Scott M. Lundberg, Su-In Lee
22 May 2017
Tags: FAtt
"Why Should I Trust You?": Explaining the Predictions of Any Classifier
Marco Tulio Ribeiro
Sameer Singh
Carlos Guestrin
FAtt
FaML
2.5K
19,924
0
16 Feb 2016