How and When Adversarial Robustness Transfers in Knowledge Distillation?
arXiv 2110.12072 · 22 October 2021
Rulin Shao, Ming Zhou, C. Bezemer, Cho-Jui Hsieh
AAML
Papers citing "How and When Adversarial Robustness Transfers in Knowledge Distillation?" (3 of 3 papers shown)
Releasing Inequality Phenomena in L∞-Adversarial Training via Input Gradient Distillation
Junxi Chen, Junhao Dong, Xiaohua Xie
AAML · 16 May 2023
Maximum Likelihood Distillation for Robust Modulation Classification
Javier Maroto, Gérôme Bovet, P. Frossard
AAML · 01 Nov 2022
Intriguing Properties of Vision Transformers
Muzammal Naseer, Kanchana Ranasinghe, Salman Khan, Munawar Hayat, F. Khan, Ming-Hsuan Yang
ViT · 21 May 2021