The Privileged Students: On the Value of Initialization in Multilingual Knowledge Distillation
arXiv:2406.16524
24 June 2024
Haryo Akbarianto Wibowo
Thamar Solorio
Alham Fikri Aji

Papers citing "The Privileged Students: On the Value of Initialization in Multilingual Knowledge Distillation"

2 / 2 papers shown

Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation
Jan Christian Blaise Cruz, Alham Fikri Aji
22 Jan 2025

XLM-T: Multilingual Language Models in Twitter for Sentiment Analysis and Beyond
Francesco Barbieri, Luis Espinosa Anke, Jose Camacho-Collados
25 Apr 2021