Knowledge Distillation Meets Open-Set Semi-Supervised Learning
arXiv:2205.06701 · 13 May 2022
Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos
Papers citing "Knowledge Distillation Meets Open-Set Semi-Supervised Learning" (7 papers shown)
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang, Dong Bok Lee, Hyungjoon Jang, Sung Ju Hwang
12 May 2025
Towards Realistic Semi-Supervised Learning
Mamshad Nayeem Rizve, Navid Kardan, M. Shah
05 Jul 2022
Trash to Treasure: Harvesting OOD Data with Cross-Modal Matching for Open-Set Semi-Supervised Learning
Junkai Huang, Chaowei Fang, Weikai Chen, Z. Chai, Xiaolin K. Wei, Pengxu Wei, Liang Lin, Guanbin Li
12 Aug 2021
Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017
Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016
ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
01 Sep 2014