Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher
arXiv: 2110.08532 · 16 October 2021
Mehdi Rezagholizadeh, Aref Jafari, Puneeth Salad, Pranav Sharma, Ali Saheb Pasand, Ali Ghodsi
Papers citing "Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher" (5 papers):
1. Revisiting the Relationship between Adversarial and Clean Training: Why Clean Training Can Make Adversarial Training Better [AAML]
   MingWei Zhou, Xiaobing Pei. 30 Mar 2025.
2. HomoDistil: Homotopic Task-Agnostic Distillation of Pre-trained Transformers [VLM]
   Chen Liang, Haoming Jiang, Zheng Li, Xianfeng Tang, Bin Yin, Tuo Zhao. 19 Feb 2023.
3. Supervision Complexity and its Role in Knowledge Distillation
   Hrayr Harutyunyan, Ankit Singh Rawat, Aditya Krishna Menon, Seungyeon Kim, Sanjiv Kumar. 28 Jan 2023.
4. Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics
   Prajjwal Bhargava, Aleksandr Drozd, Anna Rogers. 04 Oct 2021.
5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding [ELM]
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman. 20 Apr 2018.