Knowledge Distillation from Single to Multi Labels: an Empirical Study

15 March 2023
Youcai Zhang
Yuzhuo Qin
Heng-Ye Liu
Yanhao Zhang
Yaqian Li
X. Gu
Topic: VLM

Papers citing "Knowledge Distillation from Single to Multi Labels: an Empirical Study"

3 papers shown

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang
Junhao Song
Xudong Han
Ziqian Bi
Pohsun Feng
...
Yujiao Shi
Qian Niu
Cheng Fei
Keyu Chen
Ming Liu
Topic: VLM
18 Apr 2025

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang
Wenbin He
Liang Gou
Liu Ren
Chris Bryan
25 Jun 2024

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian
Dilip Krishnan
Phillip Isola
23 Oct 2019