Bag of Instances Aggregation Boosts Self-supervised Distillation
arXiv: 2107.01691 · 4 July 2021
Haohang Xu, Jiemin Fang, Xiaopeng Zhang, Lingxi Xie, Xinggang Wang, Wenrui Dai, H. Xiong, Qi Tian
SSL
Papers citing "Bag of Instances Aggregation Boosts Self-supervised Distillation" (7 of 7 shown)
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang, Dong Bok Lee, Hyungjoon Jang, Sung Ju Hwang
VLM · 41 · 0 · 0 · 12 May 2025
Retro: Reusing teacher projection head for efficient embedding distillation on Lightweight Models via Self-supervised Learning
Khanh-Binh Nguyen, Chae Jung Park
32 · 0 · 0 · 24 May 2024
Pixel-Wise Contrastive Distillation
Junqiang Huang, Zichao Guo
37 · 4 · 0 · 01 Nov 2022
Towards Sustainable Self-supervised Learning
Shanghua Gao, Pan Zhou, Ming-Ming Cheng, Shuicheng Yan
CLL · 33 · 7 · 0 · 20 Oct 2022
With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations
Debidatta Dwibedi, Y. Aytar, Jonathan Tompson, P. Sermanet, Andrew Zisserman
SSL · 183 · 450 · 0 · 29 Apr 2021
SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
SSL · 231 · 190 · 0 · 12 Jan 2021
Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL · 238 · 3,367 · 0 · 09 Mar 2020