Multi-level Knowledge Distillation via Knowledge Alignment and Correlation
arXiv:2012.00573 · 1 December 2020
Fei Ding, Yin Yang, Hongxin Hu, V. Krovi, Feng Luo
Papers citing "Multi-level Knowledge Distillation via Knowledge Alignment and Correlation" (4 papers)
Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
29 Apr 2021

SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
12 Jan 2021 · SSL

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
09 Mar 2020 · SSL

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
06 Mar 2017 · OOD · MoMe