KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks
Rishabh Bhardwaj, Tushar Vaidya, Soujanya Poria
arXiv:2110.02432, 6 October 2021
Tags: OT, FedML
Papers citing "KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks" (3 papers)
1. Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs
   Nicolas Boizard, Kevin El Haddad, Céline Hudelot, Pierre Colombo
   28 Jan 2025
2. Multi-Level Optimal Transport for Universal Cross-Tokenizer Knowledge Distillation on Language Models
   Xiao Cui, Mo Zhu, Yulei Qin, Liang Xie, Wengang Zhou, H. Li
   19 Dec 2024
3. A Survey on Bias and Fairness in Machine Learning
   Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
   Tags: SyDa, FaML
   23 Aug 2019