Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation
Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen
19 October 2021 · arXiv:2110.09674
Papers citing "Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation" (6 of 6 papers shown)
Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Y. Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
VLM · 18 Apr 2025

KnFu: Effective Knowledge Fusion
Seyed Jamal Seyed-Mohammadi, Kawa Atapour, J. Abouei, Arash Mohammadi
FedML · 18 Mar 2024

Stochastic Multiple Target Sampling Gradient Descent
Hoang Phan, Ngoc N. Tran, Trung Le, Toan M. Tran, Nhat Ho, Dinh Q. Phung
04 Jun 2022

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020

Bilevel Programming for Hyperparameter Optimization and Meta-Learning
Luca Franceschi, P. Frasconi, Saverio Salzo, Riccardo Grazzi, Massimiliano Pontil
13 Jun 2018

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018