ResearchTrend.AI

LightMBERT: A Simple Yet Effective Method for Multilingual BERT Distillation
arXiv: 2103.06418

11 March 2021
Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, Fang Wang, Qun Liu

Papers citing "LightMBERT: A Simple Yet Effective Method for Multilingual BERT Distillation"

3 / 3 papers shown
CAMeMBERT: Cascading Assistant-Mediated Multilingual BERT
Dan DeGenaro, Jugal Kalita
22 Dec 2022
Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models
Harshita Diddee, Sandipan Dandapat, Monojit Choudhury, T. Ganu, Kalika Bali
27 Oct 2022
Distilling the Knowledge of Romanian BERTs Using Multiple Teachers
Andrei-Marius Avram, Darius Catrina, Dumitru-Clementin Cercel, Mihai Dascălu, Traian Rebedea, Vasile Păiș, Dan Tufiș
23 Dec 2021