MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models
Ying Zhang, Ziheng Yang, Shufan Ji
arXiv:2407.02775 · 3 July 2024 · KELM
Papers citing "MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models" (2 of 2 papers shown)
Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
MQ · 12 Sep 2019
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 20 Apr 2018