Dual-Space Knowledge Distillation for Large Language Models
arXiv:2406.17328 · 25 June 2024
Songming Zhang, Xue Zhang, Zengkui Sun, Yufeng Chen, Jinan Xu
Links: ArXiv | PDF | HTML
Papers citing "Dual-Space Knowledge Distillation for Large Language Models" (3 / 3 papers shown):
1. Knowledge Distillation of Domain-adapted LLMs for Question-Answering in Telecom
   Rishika Sen, Sujoy Roychowdhury, Sumit Soman, H. G. Ranjani, Srikhetra Mohanty
   28 Apr 2025

2. Cross-Tokenizer Distillation via Approximate Likelihood Matching
   Benjamin Minixhofer, Ivan Vulić, E. Ponti
   25 Mar 2025

3. Scaling Laws for Neural Language Models
   Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
   23 Jan 2020