arXiv:1706.00384
Deep Mutual Learning
1 June 2017
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu
Tags: FedML
Papers citing "Deep Mutual Learning" (50 of 209 shown)
Swapped Logit Distillation via Bi-level Teacher Alignment (27 Apr 2025)
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng

Causality Enhanced Origin-Destination Flow Prediction in Data-Scarce Cities (09 Mar 2025)
Tao Feng, Yunke Zhang, Huandong Wang, Yong Li

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective (13 Jan 2025)
Jinjing Zhu, Songze Li, Lin Wang

Knowledge Distillation with Adapted Weight (06 Jan 2025)
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng

GazeGen: Gaze-Driven User Interaction for Visual Content Generation (07 Nov 2024) [VGen]
He-Yen Hsieh, Ziyun Li, Sai Qian Zhang, W. Ting, Kao-Den Chang, B. D. Salvo, Chiao Liu, H. T. Kung
Multiple Information Prompt Learning for Cloth-Changing Person Re-Identification (01 Nov 2024)
Shengxun Wei, Zan Gao, Yibo Zhao, Weili Guan, Shengyong Chen

Browsing without Third-Party Cookies: What Do You See? (14 Oct 2024) [BDL]
Maxwell Lin, Shihan Lin, Helen Wu, Karen Wang, Xiaowei Yang

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies (30 Sep 2024)
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal

Harmonizing knowledge Transfer in Neural Network with Unified Distillation (27 Sep 2024)
Yaomin Huang, Zaomin Yan, Chaomin Shen, Faming Fang, Guixu Zhang

Online Multi-level Contrastive Representation Distillation for Cross-Subject fNIRS Emotion Recognition (24 Sep 2024)
Zhili Lai, Chunmei Qing, Junpeng Tan, Wanxiang Luo, Xiangmin Xu
Deep Companion Learning: Enhancing Generalization Through Historical Consistency (26 Jul 2024) [FedML]
Ruizhao Zhu, Venkatesh Saligrama

Continual Learning with Diffusion-based Generative Replay for Industrial Streaming Data (22 Jun 2024) [DiffM, AI4CE]
Jiayi He, Jiao Chen, Qianmiao Liu, Suyan Dai, Jianhua Tang, Dongpo Liu

DistilDoc: Knowledge Distillation for Visually-Rich Document Applications (12 Jun 2024)
Jordy Van Landeghem, Subhajit Maity, Ayan Banerjee, Matthew Blaschko, Marie-Francine Moens, Josep Lladós, Sanket Biswas

Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors (19 May 2024) [DD]
Chun-Yin Huang, Kartik Srinivas, Xin Zhang, Xiaoxiao Li

The Curse of Diversity in Ensemble-Based Exploration (07 May 2024)
Zhixuan Lin, P. D'Oro, Evgenii Nikishin, Aaron C. Courville
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective (22 Apr 2024) [VLM]
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu

Practical Insights into Knowledge Distillation for Pre-Trained Models (22 Feb 2024)
Norah Alballa, Marco Canini

Predictive Churn with the Set of Good Models (12 Feb 2024)
J. Watson-Daniels, Flavio du Pin Calmon, Alexander D'Amour, Carol Xuan Long, David C. Parkes, Berk Ustun

P2Seg: Pointly-supervised Segmentation via Mutual Distillation (18 Jan 2024)
Zipeng Wang, Xuehui Yu, Xumeng Han, Wenwen Yu, Zhixun Huang, Jianbin Jiao, Zhenjun Han

Mutual Distillation Learning For Person Re-Identification (12 Jan 2024)
Huiyuan Fu, Kuilong Cui, Chuanming Wang, Mengshi Qi, Huadong Ma
Semi-supervised learning via DQN for log anomaly detection (06 Jan 2024)
Yingying He, Xiaobing Pei, Lihong Shen

Decoupled Knowledge with Ensemble Learning for Online Distillation (18 Dec 2023)
Baitan Shao, Ying Chen

Cooperative Learning for Cost-Adaptive Inference (13 Dec 2023)
Xingli Fang, Richard M. Bradford, Jung-Eun Kim

Choosing Wisely and Learning Deeply: Selective Cross-Modality Distillation via CLIP for Domain Generalization (26 Nov 2023) [VLM]
Jixuan Leng, Yijiang Li, Haohan Wang

Understanding the Effects of Projectors in Knowledge Distillation (26 Oct 2023)
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang

Expression Syntax Information Bottleneck for Math Word Problems (24 Oct 2023)
Jing Xiong, Chengming Li, Min Yang, Xiping Hu, Bin Hu
Dual Compensation Residual Networks for Class Imbalanced Learning (25 Aug 2023)
Rui Hou, Hong Chang, Bingpeng Ma, Shiguang Shan, Xilin Chen

Multi-Label Knowledge Distillation (12 Aug 2023)
Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang

Multi-View Fusion and Distillation for Subgrade Distresses Detection based on 3D-GPR (09 Aug 2023)
Chunpeng Zhou, Kang Ning, Haishuai Wang, Zhi Yu, Sheng Zhou, Jiajun Bu

Teacher-Student Architecture for Knowledge Distillation: A Survey (08 Aug 2023)
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu

SwinMM: Masked Multi-view with Swin Transformers for 3D Medical Image Segmentation (24 Jul 2023) [ViT, MedIm]
Yiqing Wang, Zihan Li, Jieru Mei, Zi-Ying Wei, Li Liu, Chen Wang, Shengtian Sang, Alan Yuille, Cihang Xie, Yuyin Zhou
Frameless Graph Knowledge Distillation (13 Jul 2023)
Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao

CrossKD: Cross-Head Knowledge Distillation for Object Detection (20 Jun 2023)
Jiabao Wang, Yuming Chen, Zhaohui Zheng, Xiang Li, Ming-Ming Cheng, Qibin Hou

Towards Higher Pareto Frontier in Multilingual Machine Translation (25 May 2023)
Yi-Chong Huang, Xiaocheng Feng, Xinwei Geng, Baohang Li, Bing Qin

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? (22 May 2023)
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang

Self-Distillation with Meta Learning for Knowledge Graph Completion (20 May 2023)
Yunshui Li, Junhao Liu, Chengming Li, Min Yang

Student-friendly Knowledge Distillation (18 May 2023)
Mengyang Yuan, Bo Lang, Fengnan Quan
GeNAS: Neural Architecture Search with Better Generalization (15 May 2023)
Joonhyun Jeong, Joonsang Yu, Geondo Park, Dongyoon Han, Y. Yoo

EAML: Ensemble Self-Attention-based Mutual Learning Network for Document Image Classification (11 May 2023)
Souhail Bakkali, Zuheng Ming, Mickael Coustaty, Marçal Rusiñol

CORSD: Class-Oriented Relational Self Distillation (28 Apr 2023)
Muzhou Yu, S. Tan, Kailu Wu, Runpei Dong, Linfeng Zhang, Kaisheng Ma

Self-discipline on multiple channels (27 Apr 2023)
Jiutian Zhao, Liangchen Luo, Hao Wang

Function-Consistent Feature Distillation (24 Apr 2023)
Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen

Grouped Knowledge Distillation for Deep Face Recognition (10 Apr 2023) [CVBM]
Weisong Zhao, Xiangyu Zhu, Kaiwen Guo, Xiaoyu Zhang, Zhen Lei
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation (26 Mar 2023)
Tianli Zhang, Mengqi Xue, Jiangtao Zhang, Haofei Zhang, Yu Wang, Lechao Cheng, Jie Song, Mingli Song

Focus on Your Target: A Dual Teacher-Student Framework for Domain-adaptive Semantic Segmentation (16 Mar 2023)
Xinyue Huo, Lingxi Xie, Wen-gang Zhou, Houqiang Li, Qi Tian

CoT-MISR: Marrying Convolution and Transformer for Multi-Image Super-Resolution (12 Mar 2023) [SupR, ViT]
Mingming Xiu, Yang Nie, Qing-Huang Song, Chun Liu

Smooth and Stepwise Self-Distillation for Object Detection (09 Mar 2023) [ObjD]
Jieren Deng, Xiaoxia Zhou, Hao Tian, Zhihong Pan, Derek Aguiar

Graph-based Knowledge Distillation: A survey and experimental evaluation (27 Feb 2023)
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao

Distilling Calibrated Student from an Uncalibrated Teacher (22 Feb 2023) [FedML]
Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra

Supervision Complexity and its Role in Knowledge Distillation (28 Jan 2023)
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Surinder Kumar