HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression
16 October 2021 · arXiv: 2110.08551
Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu
Tags: VLM
Papers citing "HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression" (4 of 4 shown):

| Title | Authors | Tags | Citations | Published |
|---|---|---|---|---|
| Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT | Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer | MQ | 505 | 12 Sep 2019 |
| Text Summarization with Pretrained Encoders | Yang Liu, Mirella Lapata | MILM | 1,417 | 22 Aug 2019 |
| Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks | Chelsea Finn, Pieter Abbeel, Sergey Levine | OOD | 11,568 | 09 Mar 2017 |
| Forward and Reverse Gradient-Based Hyperparameter Optimization | Luca Franceschi, Michele Donini, P. Frasconi, Massimiliano Pontil | — | 370 | 06 Mar 2017 |