ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Regularizing Class-wise Predictions via Self-knowledge Distillation
Sukmin Yun, Jongjin Park, Kimin Lee, Jinwoo Shin
arXiv:2003.13964 · 31 March 2020

Papers citing "Regularizing Class-wise Predictions via Self-knowledge Distillation"

36 / 36 papers shown
Learning Critically: Selective Self Distillation in Federated Learning on Non-IID Data
Yuting He, Yiqiang Chen, Xiaodong Yang, H. Yu, Yi-Hua Huang, Yang Gu
FedML · 53 / 20 / 0 · 20 Apr 2025
JPEG Inspired Deep Learning
Ahmed H. Salamah, Kaixiang Zheng, Yiwen Liu, E. Yang
27 / 0 / 0 · 09 Oct 2024
Revisiting Confidence Estimation: Towards Reliable Failure Prediction
Fei Zhu, Xu-Yao Zhang, Zhen Cheng, Cheng-Lin Liu
UQCV · 44 / 10 / 0 · 05 Mar 2024
P2Seg: Pointly-supervised Segmentation via Mutual Distillation
Zipeng Wang, Xuehui Yu, Xumeng Han, Wenwen Yu, Zhixun Huang, Jianbin Jiao, Zhenjun Han
23 / 0 / 0 · 18 Jan 2024
Learning Contrastive Self-Distillation for Ultra-Fine-Grained Visual Categorization Targeting Limited Samples
Ziye Fang, Xin Jiang, Hao Tang, Zechao Li
30 / 12 / 0 · 10 Nov 2023
Towards Generalized Multi-stage Clustering: Multi-view Self-distillation
Jiatai Wang, Zhiwei Xu, Xin Wang, Tao Li
11 / 1 / 0 · 29 Oct 2023
CORSD: Class-Oriented Relational Self Distillation
Muzhou Yu, S. Tan, Kailu Wu, Runpei Dong, Linfeng Zhang, Kaisheng Ma
18 / 0 / 0 · 28 Apr 2023
FSNet: Redesign Self-Supervised MonoDepth for Full-Scale Depth Prediction for Autonomous Driving
Yuxuan Liu, Zhenhua Xu, Huaiyang Huang, Lujia Wang, Ming-Yu Liu
MDE · 38 / 3 / 0 · 21 Apr 2023
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels
Zhendong Yang, Ailing Zeng, Zhe Li, Tianke Zhang, Chun Yuan, Yu Li
21 / 72 / 0 · 23 Mar 2023
Rethinking Confidence Calibration for Failure Prediction
Fei Zhu, Zhen Cheng, Xu-Yao Zhang, Cheng-Lin Liu
UQCV · 14 / 39 / 0 · 06 Mar 2023
Rethinking Soft Label in Label Distribution Learning Perspective
Seungbum Hong, Jihun Yoon, Bogyu Park, Min-Kook Choi
31 / 0 / 0 · 31 Jan 2023
SADT: Combining Sharpness-Aware Minimization with Self-Distillation for Improved Model Generalization
Masud An Nur Islam Fahim, Jani Boutellier
32 / 0 / 0 · 01 Nov 2022
Respecting Transfer Gap in Knowledge Distillation
Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang
21 / 23 / 0 · 23 Oct 2022
Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification
Linhao Qu, Xiao-Zhuo Luo, Manning Wang, Zhijian Song
WSOD · 26 / 57 / 0 · 07 Oct 2022
ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State
Xinshao Wang, Yang Hua, Elyor Kodirov, S. Mukherjee, David A. Clifton, N. Robertson
13 / 6 / 0 · 30 Jun 2022
Improving Generalization of Metric Learning via Listwise Self-distillation
Zelong Zeng, Fan Yang, Z. Wang, Shin'ichi Satoh
FedML · 28 / 1 / 0 · 17 Jun 2022
Contrastive Learning for Improving ASR Robustness in Spoken Language Understanding
Yanfeng Chang, Yun-Nung Chen
20 / 9 / 0 · 02 May 2022
Robust Cross-Modal Representation Learning with Progressive Self-Distillation
A. Andonian, Shixing Chen, Raffay Hamid
VLM · 19 / 55 / 0 · 10 Apr 2022
Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
13 / 60 / 0 · 30 Mar 2022
Reducing Flipping Errors in Deep Neural Networks
Xiang Deng, Yun Xiao, Bo Long, Zhongfei Zhang
AAML · 21 / 3 / 0 · 16 Mar 2022
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition
Kuan-Chuan Peng
39 / 2 / 0 · 04 Feb 2022
Fortuitous Forgetting in Connectionist Networks
Hattie Zhou, Ankit Vani, Hugo Larochelle, Aaron Courville
CLL · 6 / 42 / 0 · 01 Feb 2022
Deep Hash Distillation for Image Retrieval
Young Kyun Jang, Geonmo Gu, ByungSoo Ko, Isaac Kang, N. Cho
19 / 34 / 0 · 16 Dec 2021
Unsupervised Domain Adaptive Person Re-Identification via Human Learning Imitation
Yang Peng, Ping Liu, Yawei Luo, Pan Zhou, Zichuan Xu, Jingen Liu
OOD · 16 / 0 / 0 · 28 Nov 2021
MUSE: Feature Self-Distillation with Mutual Information and Self-Information
Yunpeng Gong, Ye Yu, Gaurav Mittal, Greg Mori, Mei Chen
SSL · 17 / 2 / 0 · 25 Oct 2021
Class-Discriminative CNN Compression
Yuchen Liu, D. Wentzlaff, S. Kung
24 / 1 / 0 · 21 Oct 2021
Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher
Mehdi Rezagholizadeh, A. Jafari, Puneeth Salad, Pranav Sharma, Ali Saheb Pasand, A. Ghodsi
71 / 17 / 0 · 16 Oct 2021
A Short Study on Compressing Decoder-Based Language Models
Tianda Li, Yassir El Mesbahi, I. Kobyzev, Ahmad Rashid, A. Mahmud, Nithin Anchuri, Habib Hajimolahoseini, Yang Liu, Mehdi Rezagholizadeh
84 / 25 / 0 · 16 Oct 2021
Semantic Concentration for Domain Adaptation
Shuang Li, Mixue Xie, Fangrui Lv, Chi Harold Liu, Jian Liang, C. Qin, Wei Li
52 / 87 / 0 · 12 Aug 2021
Feature Mining: A Novel Training Strategy for Convolutional Neural Network
Tianshu Xie, Xuan Cheng, Xiaomin Wang, Minghui Liu, Jiali Deng, Ming Liu
23 / 5 / 0 · 18 Jul 2021
Distilling EEG Representations via Capsules for Affective Computing
Guangyi Zhang, Ali Etemad
22 / 16 / 0 · 30 Apr 2021
Federated Few-Shot Learning with Adversarial Learning
Chenyou Fan, Jianwei Huang
FedML · 13 / 29 / 0 · 01 Apr 2021
Knowledge Evolution in Neural Networks
Ahmed Taha, Abhinav Shrivastava, L. Davis
42 / 21 / 0 · 09 Mar 2021
FedADC: Accelerated Federated Learning with Drift Control
Emre Ozfatura, Kerem Ozfatura, Deniz Gunduz
FedML · 27 / 37 / 0 · 16 Dec 2020
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 19 / 2,835 / 0 · 09 Jun 2020
Range Loss for Deep Face Recognition with Long-tail
Xiao Zhang, Zhiyuan Fang, Yandong Wen, Zhifeng Li, Yu Qiao
CVBM · 232 / 446 / 0 · 28 Nov 2016