Rethinking Decoupled Knowledge Distillation: A Predictive Distribution Perspective. IEEE Transactions on Neural Networks and Learning Systems (IEEE TNNLS), 2025.
Diffusion-Assisted Distillation for Self-Supervised Graph Representation Learning with MLPs. IEEE Transactions on Artificial Intelligence (IEEE TAI), 2025.
The Role of Teacher Calibration in Knowledge Distillation. IEEE Access, 2025.
Parameter-Free Logit Distillation via Sorting Mechanism. IEEE Signal Processing Letters (IEEE SPL), 2025.
Expandable Residual Approximation for Knowledge Distillation. IEEE Transactions on Neural Networks and Learning Systems (IEEE TNNLS), 2025.
Sparse Logit Sampling: Accelerating Knowledge Distillation in LLMs. Annual Meeting of the Association for Computational Linguistics (ACL), 2025.
Cyclic Contrastive Knowledge Transfer for Open-Vocabulary Object Detection. International Conference on Learning Representations (ICLR), 2025.