arXiv: 2305.08096
Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
14 May 2023
Songming Zhang, Yunlong Liang, Shuaibo Wang, Wenjuan Han, Jian Liu, Jinan Xu
ArXiv (abs) · PDF · HTML · GitHub (736★)
Papers citing "Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation" (6 papers)
Multi-Hypothesis Distillation of Multilingual Neural Translation Models for Low-Resource Languages
Aarón Galiano-Jiménez, Juan Antonio Pérez-Ortiz, F. Sánchez-Martínez, Víctor M. Sánchez-Cartagena
29 Jul 2025
Distilled Circuits: A Mechanistic Study of Internal Restructuring in Knowledge Distillation
Reilly Haskins, Benjamin Adams
16 May 2025
A Dual-Space Framework for General Knowledge Distillation of Large Language Models
Wei Wei, Songming Zhang, Yunlong Liang, Fandong Meng, Yufeng Chen, Jinan Xu, Jie Zhou
15 Apr 2025
Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models
Jun Rao, Xuebo Liu, Zepeng Lin, Liang Ding, Jing Li, Dacheng Tao, Min Zhang
19 Sep 2024
Towards Lifelong Learning of Large Language Models: A Survey
Junhao Zheng, Shengjie Qiu, Chengming Shi, Qianli Ma
10 Jun 2024
D²TV: Dual Knowledge Distillation and Target-oriented Vision Modeling for Many-to-Many Multimodal Summarization
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023
Yunlong Liang, Fandong Meng, Jiaan Wang, Jinan Xu, Jie Zhou
22 May 2023