A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models

30 November 2020
J. Ku, Jihun Oh, Youngyoon Lee, Gaurav Pooniwala, Sangjeong Lee

Papers citing "A Selective Survey on Versatile Knowledge Distillation Paradigm for Neural Network Models"

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Pohsun Feng, ..., Yujiao Shi, Qian Niu, Cheng Fei, Keyu Chen, Ming Liu
VLM · 18 Apr 2025
Generalization in Neural Networks: A Broad Survey
Chris Rohlfs
Neurocomputing, 2022
OOD · AI4CE · 04 Sep 2022