arXiv: 2207.12106
Black-box Few-shot Knowledge Distillation
25 July 2022
Dang Nguyen, Sunil R. Gupta, Kien Do, Svetha Venkatesh
Papers citing "Black-box Few-shot Knowledge Distillation" (6 papers shown)
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang, Dong Bok Lee, Hyungjoon Jang, Sung Ju Hwang
12 May 2025
HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification
Omar S. El-Assiouti, Ghada Hamed, Dina Khattab, H. M. Ebied
10 Jul 2024
Can't Hide Behind the API: Stealing Black-Box Commercial Embedding Models
Manveer Singh Tamber, Jasper Xian, Jimmy Lin
13 Jun 2024
Tiny models from tiny data: Textual and null-text inversion for few-shot distillation
Erik Landolsi, Fredrik Kahl
05 Jun 2024
Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy
Jingru Li, Sheng Zhou, Liangcheng Li, Haishuai Wang, Zhi Yu, Jiajun Bu
29 Aug 2022
High Dimensional Level Set Estimation with Bayesian Neural Network
Huong Ha, Sunil R. Gupta, Santu Rana, Svetha Venkatesh
17 Dec 2020