KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks

6 October 2021
Authors: Rishabh Bhardwaj, Tushar Vaidya, Soujanya Poria
Communities: OT, FedML
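
For context on the idea named in the title, below is a minimal, hypothetical sketch of an optimal-transport-based distillation loss: an entropic-regularized OT (Sinkhorn) distance between teacher and student output distributions, in place of the usual KL divergence. This is not the KNOT implementation; the toy cost matrix, the `reg` hyperparameter, and all variable names are illustrative assumptions.

```python
# Illustrative sketch only: Sinkhorn distance between teacher and student
# class distributions, used as a distillation loss. NOT the KNOT code.
import numpy as np

def sinkhorn_distance(p, q, cost, reg=0.1, n_iters=200):
    """Entropic-regularized OT distance between distributions p and q.

    p, q : 1-D probability vectors over the label/vocabulary space.
    cost : (len(p), len(q)) ground-cost matrix between classes (assumed given).
    reg  : entropic regularization strength (illustrative hyperparameter).
    """
    K = np.exp(-cost / reg)          # Gibbs kernel from the ground cost
    u = np.ones_like(p)
    for _ in range(n_iters):         # standard Sinkhorn fixed-point updates
        v = q / (K.T @ u)
        u = p / (K @ v)
    transport = np.outer(u, v) * K   # approximate optimal transport plan
    return float(np.sum(transport * cost))

# Hypothetical usage: teacher and student softmax outputs over 4 classes,
# with a toy 0/1 cost matrix (0 on the diagonal, 1 elsewhere).
teacher = np.array([0.70, 0.20, 0.05, 0.05])
student = np.array([0.40, 0.30, 0.20, 0.10])
cost = 1.0 - np.eye(4)
print(sinkhorn_distance(teacher, student, cost))
```

Unlike KL divergence, the OT distance uses the ground cost to account for how far apart classes are, so mass moved between similar classes is penalized less than mass moved between dissimilar ones.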

Papers citing "KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks"

3 / 3 papers shown
Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs
Nicolas Boizard, Kevin El Haddad, Céline Hudelot, Pierre Colombo
28 Jan 2025

Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models
Xiao Cui, Mo Zhu, Yulei Qin, Liang Xie, Wengang Zhou, H. Li
19 Dec 2024

A Survey on Bias and Fairness in Machine Learning
Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
Communities: SyDa, FaML
23 Aug 2019