arXiv: 2409.12586
Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights
19 September 2024
Mohamad Ballout
Ulf Krumnack
Gunther Heidemann
Kai-Uwe Kühnberger
Papers citing "Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights" (1 of 1 papers shown)
Honey, I Shrunk the Language Model: Impact of Knowledge Distillation Methods on Performance and Explainability
Daniel Hendriks
Philipp Spitzer
Niklas Kühl
Gerhard Satzger
22 Apr 2025