2211.16231
Cited By
Curriculum Temperature for Knowledge Distillation
29 November 2022
Zheng Li
Xiang Li
Lingfeng Yang
Borui Zhao
Renjie Song
Lei Luo
Jun Yu Li
Jian Yang
Papers citing "Curriculum Temperature for Knowledge Distillation"
50 / 55 papers shown
Title
DNAD: Differentiable Neural Architecture Distillation
Xuan Rao
Bo Zhao
Derong Liu
21
1
0
25 Apr 2025
Analytical Softmax Temperature Setting from Feature Dimensions for Model- and Domain-Robust Classification
Tatsuhito Hasegawa
Shunsuke Sakai
35
0
0
22 Apr 2025
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks
Yuang Jia
Xiaojuan Shan
Jun-Xiong Xia
Guancheng Wan
Y. Zhang
Wenke Huang
Mang Ye
Stan Z. Li
42
0
0
01 Apr 2025
Sample-level Adaptive Knowledge Distillation for Action Recognition
Ping Li
Chenhao Ping
Wenxiao Wang
Mingli Song
49
0
0
01 Apr 2025
Adaptive Temperature Based on Logits Correlation in Knowledge Distillation
Kazuhiro Matsuyama
Usman Anjum
Satoko Matsuyama
Tetsuo Shoda
J. Zhan
55
0
0
12 Mar 2025
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang
Fei Xie
Weidong Cai
Chao Ma
68
0
0
28 Feb 2025
Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang
Tieyuan Chen
Haipeng Wang
39
0
0
09 Feb 2025
TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models
Makoto Shing
Kou Misaki
Han Bao
Sho Yokoi
Takuya Akiba
VLM
51
1
0
28 Jan 2025
ECG-guided individual identification via PPG
Riling Wei
Hanjie Chen
Kelu Yao
Chuanguang Yang
Jun Wang
Chao Li
26
0
0
30 Dec 2024
Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv
Haoyuan Yang
P. Li
69
1
0
11 Dec 2024
Toward Fair Graph Neural Networks Via Dual-Teacher Knowledge Distillation
Chengyu Li
Debo Cheng
Guixian Zhang
Yi Li
Shichao Zhang
73
0
0
30 Nov 2024
CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence
Zao Zhang
Huaming Chen
Pei Ning
Nan Yang
Dong Yuan
19
1
0
17 Oct 2024
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher
Yong Guo
Shulian Zhang
Haolin Pan
Jing Liu
Yulun Zhang
Jian Chen
27
0
0
05 Oct 2024
Fair4Free: Generating High-fidelity Fair Synthetic Samples using Data Free Distillation
Md Fahim Sikder
Daniel de Leng
Fredrik Heintz
24
1
0
02 Oct 2024
Collaborative Knowledge Distillation via a Learning-by-Education Node Community
Anestis Kaimakamidis
Ioannis Mademlis
Ioannis Pitas
13
0
0
30 Sep 2024
Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode
Muhammad Saif Ullah Khan
Tahira Shehzadi
Didier Stricker
Muhammad Zeshan Afzal
29
0
0
30 Sep 2024
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation
Chaomin Shen
Yaomin Huang
Haokun Zhu
Jinsong Fan
Guixu Zhang
16
0
0
27 Sep 2024
Kendall's τ Coefficient for Logits Distillation
Yuchen Guan
Runxi Cheng
Kang Liu
Chun Yuan
16
0
0
26 Sep 2024
Cascade Prompt Learning for Vision-Language Model Adaptation
Ge Wu
Xin Zhang
Zheng Li
Zhaowei Chen
Jiajun Liang
Jian Yang
Xiang Li
VLM
17
6
0
26 Sep 2024
Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios
Xinlei Huang
Jialiang Tang
Xubin Zheng
Jinjia Zhou
Wenxin Yu
Ning Jiang
16
0
0
12 Sep 2024
LoCa: Logit Calibration for Knowledge Distillation
Runming Yang
Taiqiang Wu
Yujiu Yang
25
0
0
07 Sep 2024
Computer Vision Model Compression Techniques for Embedded Systems: A Survey
Alexandre Lopes
Fernando Pereira dos Santos
D. Oliveira
Mauricio Schiezaro
Hélio Pedrini
21
5
0
15 Aug 2024
A Review of Pseudo-Labeling for Computer Vision
Patrick Kage
Jay C. Rothenberger
Pavlos Andreadis
Dimitrios I. Diochnos
VLM
21
3
0
13 Aug 2024
DisCoM-KD: Cross-Modal Knowledge Distillation via Disentanglement Representation and Adversarial Learning
Dino Ienco
C. Dantas
25
1
0
05 Aug 2024
Text2LiDAR: Text-guided LiDAR Point Cloud Generation via Equirectangular Transformer
Yang Wu
Kaihua Zhang
Jianjun Qian
Jin Xie
Jian Yang
DiffM
34
4
0
29 Jul 2024
AMD: Automatic Multi-step Distillation of Large-scale Vision Models
Cheng Han
Qifan Wang
S. Dianat
Majid Rabbani
Raghuveer M. Rao
Yi Fang
Qiang Guan
Lifu Huang
Dongfang Liu
VLM
25
4
0
05 Jul 2024
Instance Temperature Knowledge Distillation
Zhengbo Zhang
Yuxi Zhou
Jia Gong
Jun Liu
Zhigang Tu
14
2
0
27 Jun 2024
Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation
Zejun Gu
Zhongming Zhao
Henghui Ding
Hao Shen
Zhao Zhang
De-Shuang Huang
24
0
0
19 May 2024
AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting
Shreyan Ganguly
Roshan Nayak
Rakshith Rao
Ujan Deb
AP Prathosh
19
1
0
11 May 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu
Xin Zhou
Pengfei Zhu
Yu Wang
Qinghua Hu
VLM
44
1
0
22 Apr 2024
Dynamic Temperature Knowledge Distillation
Yukang Wei
Yu Bai
22
4
0
19 Apr 2024
To Cool or not to Cool? Temperature Network Meets Large Foundation Models via DRO
Zi-Hao Qiu
Siqi Guo
Mao Xu
Tuo Zhao
Lijun Zhang
Tianbao Yang
AI4TS
AI4CE
32
2
0
06 Apr 2024
Improve Knowledge Distillation via Label Revision and Data Selection
Weichao Lan
Yiu-ming Cheung
Qing Xu
Buhua Liu
Zhikai Hu
Mengke Li
Zhenghua Chen
16
2
0
03 Apr 2024
Ranking Distillation for Open-Ended Video Question Answering with Insufficient Labels
Tianming Liang
Chaolei Tan
Beihao Xia
Wei-Shi Zheng
Jianfang Hu
25
1
0
21 Mar 2024
Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian
Arya Jalali
Rozhan Ahmadi
S. Kasaei
14
0
0
08 Mar 2024
PromptKD: Unsupervised Prompt Distillation for Vision-Language Models
Zheng Li
Xiang Li
Xinyi Fu
Xing Zhang
Weiqiang Wang
Shuo Chen
Jian Yang
VLM
19
33
0
05 Mar 2024
Logit Standardization in Knowledge Distillation
Shangquan Sun
Wenqi Ren
Jingzhi Li
Rui Wang
Xiaochun Cao
24
55
0
03 Mar 2024
Knowledge Distillation Based on Transformed Teacher Matching
Kaixiang Zheng
En-Hui Yang
11
16
0
17 Feb 2024
Bi-CryptoNets: Leveraging Different-Level Privacy for Encrypted Inference
Man-Jie Yuan
Zheng Zou
Wei Gao
6
0
0
02 Feb 2024
Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
Linfeng Ye
Shayan Mohajer Hamidi
Renhao Tan
En-Hui Yang
VLM
22
12
0
16 Jan 2024
Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction
Zhaoge Liu
Xiaohao Xu
Yunkang Cao
Weiming Shen
VLM
8
0
0
16 Jan 2024
Balanced Multi-modal Federated Learning via Cross-Modal Infiltration
Yunfeng Fan
Wenchao Xu
Haozhao Wang
Jiaqi Zhu
Song Guo
12
0
0
31 Dec 2023
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data
Hatef Otroshi
Anjith George
Sébastien Marcel
17
10
0
28 Aug 2023
CLIP-KD: An Empirical Study of CLIP Model Distillation
Chuanguang Yang
Zhulin An
Libo Huang
Junyu Bi
Xinqiang Yu
Hansheng Yang
Boyu Diao
Yongjun Xu
VLM
16
25
0
24 Jul 2023
Mitigating Accuracy-Robustness Trade-off via Balanced Multi-Teacher Adversarial Distillation
Shiji Zhao
Xizhe Wang
Xingxing Wei
AAML
32
8
0
28 Jun 2023
CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang
Yuming Chen
Zhaohui Zheng
Xiang Li
Ming-Ming Cheng
Qibin Hou
35
30
0
20 Jun 2023
GripRank: Bridging the Gap between Retrieval and Generation via the Generative Knowledge Improved Passage Ranking
Jiaqi Bai
Hongcheng Guo
Jiaheng Liu
Jian Yang
Xinnian Liang
Zhao Yan
Zhoujun Li
RALM
11
14
0
29 May 2023
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li
Yuxuan Li
Penghai Zhao
Renjie Song
Xiang Li
Jian Yang
21
19
0
22 May 2023
Lightweight Self-Knowledge Distillation with Multi-source Information Fusion
Xucong Wang
Pengchao Han
Lei Guo
20
1
0
16 May 2023
Curricular Object Manipulation in LiDAR-based Object Detection
Ziyue Zhu
Qiang Meng
Xiao Wang
Ke Wang
Liujiang Yan
Jian Yang
3DPC
10
9
0
09 Apr 2023