Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning
arXiv: 2308.13958
26 August 2023
Apoorv Dankar, Adeem Jassani, Kartikaeya Kumar

Papers citing "Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning" (2 papers shown)

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
25 Jun 2024

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018