On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models
Sean Farhat, Deming Chen
arXiv 2404.03263, 4 April 2024
Papers citing "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models" (5 of 5 shown):

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
SSL
12 Jan 2021

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
SSL
09 Mar 2020

Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016

ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
VLM, ObjD
01 Sep 2014