arXiv:2409.14904
DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models
23 September 2024
Sangyeon Cho, Jangyeong Jeon, Dongjoon Lee, Changhee Lee, Junyeong Kim
Papers citing
"DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models"
KDH-MLTC: Knowledge Distillation for Healthcare Multi-Label Text Classification
Hajar Sakai, Sarah Lam
12 May 2025