Knowledge distillation via adaptive instance normalization
Jing Yang, Brais Martínez, Adrian Bulat, Georgios Tzimiropoulos
arXiv 2003.04289 · 9 March 2020
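The paper's central idea is to align student and teacher feature statistics through adaptive instance normalization (AdaIN). The sketch below illustrates that general idea in PyTorch; the function names, the plain L2 penalty, and the assumption that the two feature maps share a shape are illustrative choices, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def adain(content, style, eps=1e-5):
    # Re-normalize `content` to the channel-wise mean/std of `style`.
    # Both tensors are assumed to be feature maps of shape (B, C, H, W).
    mu_c = content.mean(dim=(2, 3), keepdim=True)
    std_c = content.std(dim=(2, 3), keepdim=True) + eps
    mu_s = style.mean(dim=(2, 3), keepdim=True)
    std_s = style.std(dim=(2, 3), keepdim=True) + eps
    return std_s * (content - mu_c) / std_c + mu_s

def adain_distill_loss(student_feat, teacher_feat):
    # Transfer the teacher's feature statistics onto the student's
    # features, then penalize the remaining mismatch with L2.
    adapted = adain(student_feat, teacher_feat.detach())
    return F.mse_loss(adapted, teacher_feat.detach())

# Illustrative usage with random feature maps of matching shape.
teacher_feat = torch.randn(8, 256, 14, 14)
student_feat = torch.randn(8, 256, 14, 14, requires_grad=True)
loss = adain_distill_loss(student_feat, teacher_feat)
loss.backward()
```

In practice a 1x1 convolutional adapter is often inserted when student and teacher channel counts differ; it is omitted here for brevity.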
Papers citing "Knowledge distillation via adaptive instance normalization" (6 papers):
FSNet: Redesign Self-Supervised MonoDepth for Full-Scale Depth Prediction for Autonomous Driving
Yuxuan Liu, Zhenhua Xu, Huaiyang Huang, Lujia Wang, Ming-Yu Liu · MDE · 3 citations · 21 Apr 2023
CLIP-TD: CLIP Targeted Distillation for Vision-Language Tasks
Zhecan Wang, Noel Codella, Yen-Chun Chen, Luowei Zhou, Jianwei Yang, Xiyang Dai, Bin Xiao, Haoxuan You, Shih-Fu Chang, Lu Yuan · CLIP, VLM · 39 citations · 15 Jan 2022
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuanyuan Peng, Keyi Guo, Hongfang Yu · 38 citations · 20 Mar 2021
Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup
Guodong Xu, Ziwei Liu, Chen Change Loy · UQCV · 39 citations · 17 Dec 2020
Knowledge Distillation: A Survey
Jianping Gou, Baosheng Yu, Stephen J. Maybank, Dacheng Tao · VLM · 2,832 citations · 09 Jun 2020
Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le · 5,326 citations · 05 Nov 2016