arXiv: 2411.06839
LLM-NEO: Parameter Efficient Knowledge Distillation for Large Language Models
11 November 2024
Runming Yang
Taiqiang Wu
Jiahao Wang
Pengfei Hu
Ngai Wong
Yujiu Yang
Papers citing "LLM-NEO: Parameter Efficient Knowledge Distillation for Large Language Models": none listed.