Knowledge Distillation: Bad Models Can Be Good Role Models
arXiv: 2203.14649
28 March 2022
Gal Kaplun, Eran Malach, Preetum Nakkiran, Shai Shalev-Shwartz

Papers citing "Knowledge Distillation: Bad Models Can Be Good Role Models" (3 / 3 papers shown)

Trans-LoRA: towards data-free Transferable Parameter Efficient Finetuning
Runqian Wang, Soumya Ghosh, David D. Cox, Diego Antognini, Aude Oliva, Rogerio Feris, Leonid Karlinsky
27 May 2024

Robust Knowledge Distillation from RNN-T Models With Noisy Training Labels Using Full-Sum Loss
Mohammad Zeineldeen, Kartik Audhkhasi, M. Baskar, Bhuvana Ramabhadran
10 March 2023

Meta Pseudo Labels
Hieu H. Pham, Zihang Dai, Qizhe Xie, Minh-Thang Luong, Quoc V. Le
23 March 2020