arXiv:2311.02849
Co-training and Co-distillation for Quality Improvement and Compression of Language Models
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023
6 November 2023
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Hongbo Zhang, Sung Ju Hwang, Alexander Min