DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
arXiv:2104.09124 · 19 April 2021
Yuting Gao, Jia-Xin Zhuang, Xiaowei Guo, Hao Cheng, Xing Sun, Ke Li, Feiyue Huang
Papers citing "DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning" (28 papers shown):
Efficient Building Roof Type Classification: A Domain-Specific Self-Supervised Approach
Guneet Mutreja, Ksenia Bittner · 28 Mar 2025

Simple Unsupervised Knowledge Distillation With Space Similarity
Aditya Singh, Haohan Wang · 20 Sep 2024

Lightweight Model Pre-training via Language Guided Knowledge Distillation
Mingsheng Li, Lin Zhang, Mingzhen Zhu, Zilong Huang, Gang Yu, Jiayuan Fan, Tao Chen · 17 Jun 2024

ProFeAT: Projected Feature Adversarial Training for Self-Supervised Learning of Robust Representations
Sravanti Addepalli, Priyam Dey, R. Venkatesh Babu · 09 Jun 2024

Relational Self-supervised Distillation with Compact Descriptors for Image Copy Detection
Juntae Kim, Sungwon Woo, Jongho Nang · 28 May 2024

A review on discriminative self-supervised learning methods
Nikolaos Giakoumoglou, Tania Stathaki · 08 May 2024 · [SSL]

On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models
Sean Farhat, Deming Chen · 04 Apr 2024

On Good Practices for Task-Specific Distillation of Large Pretrained Visual Models
Juliette Marrie, Michael Arbel, Julien Mairal, Diane Larlus · 17 Feb 2024 · [VLM, MQ]

LowDINO -- A Low Parameter Self Supervised Learning Model
Sai Krishna Prathapaneni, Shvejan Shashank, K. SrikarReddy · 28 May 2023

Know Your Self-supervised Learning: A Survey on Image-based Generative and Discriminative Training
Utku Ozbulak, Hyun Jung Lee, Beril Boga, Esla Timothy Anzaku, Ho-min Park, Arnout Van Messem, W. D. Neve, J. Vankerschaver · 23 May 2023 · [DiffM]

Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
Kaiyou Song, Jin Xie, Shanyi Zhang, Zimeng Luo · 13 Apr 2023

Learning Common Rationale to Improve Self-Supervised Representation for Fine-Grained Visual Recognition Problems
Yangyang Shu, A. Hengel, Lingqiao Liu · 03 Mar 2023 · [SSL]

A Simple Recipe for Competitive Low-compute Self supervised Vision Models
Quentin Duval, Ishan Misra, Nicolas Ballas · 23 Jan 2023

Establishing a stronger baseline for lightweight contrastive models
Wenye Lin, Yifeng Ding, Zhixiong Cao, Haitao Zheng · 14 Dec 2022

Pixel-Wise Contrastive Distillation
Junqiang Huang, Zichao Guo · 01 Nov 2022

SDCL: Self-Distillation Contrastive Learning for Chinese Spell Checking
Xiaotian Zhang, Hang Yan, Sun Yu, Xipeng Qiu · 31 Oct 2022

Effective Self-supervised Pre-training on Low-compute Networks without Distillation
Fuwen Tan, F. Saleh, Brais Martínez · 06 Oct 2022

Slimmable Networks for Contrastive Self-supervised Learning
Shuai Zhao, Xiaohan Wang, Linchao Zhu, Yi Yang · 30 Sep 2022

DSPNet: Towards Slimmable Pretrained Networks based on Discriminative Self-supervised Learning
Shaoru Wang, Zeming Li, Jin Gao, Liang Li, Weiming Hu · 13 Jul 2022

A Closer Look at Self-Supervised Lightweight Vision Transformers
Shaoru Wang, Jin Gao, Zeming Li, Jian-jun Sun, Weiming Hu · 28 May 2022 · [ViT]

PyramidCLIP: Hierarchical Feature Alignment for Vision-language Model Pretraining
Yuting Gao, Jinfeng Liu, Zihan Xu, Jinchao Zhang, Ke Li, Rongrong Ji, Chunhua Shen · 29 Apr 2022 · [VLM, CLIP]

Weak Augmentation Guided Relational Self-Supervised Learning
Mingkai Zheng, Shan You, Fei Wang, Chao Qian, Changshui Zhang, Xiaogang Wang, Chang Xu · 16 Mar 2022

On the Efficacy of Small Self-Supervised Contrastive Models without Distillation Signals
Haizhou Shi, Youcai Zhang, Siliang Tang, Wenjie Zhu, Yaqian Li, Yandong Guo, Yueting Zhuang · 30 Jul 2021 · [SyDa]

Bag of Instances Aggregation Boosts Self-supervised Distillation
Haohang Xu, Jiemin Fang, Xiaopeng Zhang, Lingxi Xie, Xinggang Wang, Wenrui Dai, H. Xiong, Qi Tian · 04 Jul 2021 · [SSL]

Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network
Sangmin Bae, Sungnyun Kim, Jongwoo Ko, Gihun Lee, SeungJong Noh, Se-Young Yun · 29 Jun 2021 · [SSL]

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin · 29 Apr 2021

SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu · 12 Jan 2021 · [SSL]

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He · 09 Mar 2020 · [SSL]