arXiv:2411.01547
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
3 November 2024
Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang
Papers citing "Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment" (2 papers)
Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Y. Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
18 Apr 2025
Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement
Shu Yang, C. Yu, Lei Liu, Hanzhi Ma, Aili Wang, Erping Li
20 Mar 2025