Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning

26 August 2023
Apoorv Dankar, Adeem Jassani, Kartikaeya Kumar

Papers citing "Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning"

2 citing papers

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
25 Jun 2024

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018