ResearchTrend.AI

Distill on the Go: Online knowledge distillation in self-supervised learning
arXiv: 2104.09866 · 20 April 2021
Prashant Bhat, Elahe Arani, Bahram Zonooz
Topics: SSL
Papers citing "Distill on the Go: Online knowledge distillation in self-supervised learning"

9 / 9 papers shown
Pixel-Wise Contrastive Distillation
Junqiang Huang, Zichao Guo
01 Nov 2022
Effective Self-supervised Pre-training on Low-compute Networks without Distillation
Fuwen Tan, F. Saleh, Brais Martínez
06 Oct 2022
FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
Sungwon Han, Sungwon Park, Fangzhao Wu, Sundong Kim, Chuhan Wu, Xing Xie, M. Cha
Topics: FedML
19 Jul 2022
Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao
28 May 2022
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
07 Sep 2021
On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals
Haizhou Shi, Youcai Zhang, Siliang Tang, Wenjie Zhu, Yaqian Li, Yandong Guo, Yueting Zhuang
Topics: SyDa
30 Jul 2021
SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
Topics: SSL
12 Jan 2021
Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018
Pixel Recurrent Neural Networks
Aaron van den Oord, Nal Kalchbrenner, Koray Kavukcuoglu
Topics: SSeg, GAN
25 Jan 2016