Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing
23 September 2021 · arXiv:2109.11105
Haoyu He, Xingjian Shi, Jonas W. Mueller, Sheng Zha, Mu Li, George Karypis

Papers citing "Distiller: A Systematic Study of Model Distillation Methods in Natural Language Processing" (7 / 7 papers shown)
1. Towards Efficient and Explainable Hate Speech Detection via Model Distillation
   Paloma Piot, Javier Parapar
   18 Dec 2024

2. Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation
   Jingxuan Wei, Linzhuang Sun, Yichong Leng, Xu Tan, Bihui Yu, Ruifeng Guo
   23 Apr 2024

3. Rediscovering Hashed Random Projections for Efficient Quantization of Contextualized Sentence Embeddings
   Ulf A. Hamster, Ji-Ung Lee, Alexander Geyken, Iryna Gurevych
   13 Mar 2023

4. AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data
   Nick Erickson, Jonas W. Mueller, Alexander Shirkov, Hang Zhang, Pedro Larroy, Mu Li, Alex Smola
   13 Mar 2020

5. Scaling Laws for Neural Language Models
   Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
   23 Jan 2020

6. A Mutual Information Maximization Perspective of Language Representation Learning
   Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama
   18 Oct 2019
7. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   20 Apr 2018