What Knowledge Gets Distilled in Knowledge Distillation?
arXiv 2205.16004 · 31 May 2022
Utkarsh Ojha, Yuheng Li, Anirudh Sundara Rajan, Yingyu Liang, Yong Jae Lee
Community: FedML
Papers citing "What Knowledge Gets Distilled in Knowledge Distillation?" (4 / 4 papers shown)
Title | Authors | Tags | Metrics | Date
Provable Weak-to-Strong Generalization via Benign Overfitting | David X. Wu, A. Sahai | - | 58 / 6 / 0 | 06 Oct 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective | Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu | VLM | 56 / 1 / 0 | 22 Apr 2024
Weight Averaging Improves Knowledge Distillation under Domain Shift | Valeriy Berezovskiy, Nikita Morozov | MoMe | 19 / 1 / 0 | 20 Sep 2023
Adversarial Machine Learning at Scale | Alexey Kurakin, Ian Goodfellow, Samy Bengio | AAML | 256 / 3,108 / 0 | 04 Nov 2016