Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
arXiv:2207.11518 · 23 July 2022
Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang

Papers citing "Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition" (22 / 22 papers shown)

MuTri: Multi-view Tri-alignment for OCT to OCTA 3D Image Translation
Z. Chen, Hualiang Wang, Chubin Ou, Xiaomeng Li · 02 Apr 2025

Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence
Zhaowei Chen, Borui Zhao, Yuchen Ge, Yuhao Chen, Renjie Song, Jiajun Liang · 09 Mar 2025

Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Y. Xu · 22 Feb 2025

ECG-guided individual identification via PPG
Riling Wei, Hanjie Chen, Kelu Yao, Chuanguang Yang, Jun Wang, Chao Li · 30 Dec 2024

MPQ-DM: Mixed Precision Quantization for Extremely Low Bit Diffusion Models [DiffM, MQ]
Weilun Feng, Haotong Qin, Chuanguang Yang, Zhulin An, Libo Huang, Boyu Diao, Fei Wang, Renshuai Tao, Y. Xu, Michele Magno · 16 Dec 2024

Prototype-Driven Multi-Feature Generation for Visible-Infrared Person Re-identification
Jiarui Li, Zhen Qiu, Yilin Yang, Yuqi Li, Zeyu Dong, Chuanguang Yang · 09 Sep 2024

CNN-Transformer Rectified Collaborative Learning for Medical Image Segmentation [ViT, MedIm]
Lanhu Wu, Miao Zhang, Yongri Piao, Zhenyan Yao, Weibing Sun, Feng Tian, Huchuan Lu · 25 Aug 2024

Adaptive Modality Balanced Online Knowledge Distillation for Brain-Eye-Computer based Dim Object Detection
Zixing Li, Chao Yan, Zhen Lan, Xiaojia Xiang, Han Zhou, Jun Lai, Dengqing Tang · 02 Jul 2024

Online Policy Distillation with Decision-Attention [OffRL]
Xinqiang Yu, Chuanguang Yang, Chengqing Yu, Libo Huang, Zhulin An, Yongjun Xu · 08 Jun 2024

CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective [VLM]
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu · 22 Apr 2024

A Comprehensive Review of Knowledge Distillation in Computer Vision [VLM]
Sheikh Musa Kaleem, Tufail Rouf, Gousia Habib, Tausifa Jan Saleem, Brejesh Lall · 01 Apr 2024

Data-Driven Estimation of the False Positive Rate of the Bayes Binary Classifier via Soft Labels
Minoh Jeong, Martina Cardone, Alex Dytso · 27 Jan 2024

CLIP-KD: An Empirical Study of CLIP Model Distillation [VLM]
Chuanguang Yang, Zhulin An, Libo Huang, Junyu Bi, Xinqiang Yu, Hansheng Yang, Boyu Diao, Yongjun Xu · 24 Jul 2023

Team AcieLee: Technical Report for EPIC-SOUNDS Audio-Based Interaction Recognition Challenge 2023
Yuqi Li, Yi-Jhen Luo, Xiaoshuai Hao, Chuanguang Yang, Zhulin An, Dantong Song, Wei Yi · 15 Jun 2023

A Survey of Historical Learning: Learning Models with Learning History [MU, AI4TS]
Xiang Li, Ge Wu, Lingfeng Yang, Wenzhe Wang, Renjie Song, Jian Yang · 23 Mar 2023

Efficient Masked Autoencoders with Self-Consistency
Zhaowen Li, Yousong Zhu, Zhiyang Chen, Wei Li, Chaoyang Zhao, Rui Zhao, Ming Tang, Jinqiao Wang · 28 Feb 2023

Unifying Synergies between Self-supervised Learning and Dynamic Computation
Tarun Krishna, Ayush Rai, Alexandru Drimbarean, Eric Arazo, Paul Albert, A. Smeaton, Kevin McGuinness, Noel E. O'Connor · 22 Jan 2023

Mutual Contrastive Learning for Visual Representation Learning [VLM, SSL]
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 26 Apr 2021

SEED: Self-supervised Distillation For Visual Representation [SSL]
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu · 12 Jan 2021

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018

Forward and Reverse Gradient-Based Hyperparameter Optimization
Luca Franceschi, Michele Donini, P. Frasconi, Massimiliano Pontil · 06 Mar 2017

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results [OOD, MoMe]
Antti Tarvainen, Harri Valpola · 06 Mar 2017