DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization
XueQing Deng, Dawei Sun, Shawn D. Newsam, Peng Wang
arXiv:2204.05547 · 12 April 2022

Papers citing "DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization" (6 papers):

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025 · 0 citations

Joint-DetNAS: Upgrade Your Detector with NAS, Pruning and Dynamic Distillation
Lewei Yao, Renjie Pi, Hang Xu, Wei Zhang, Zhenguo Li, Tong Zhang
27 May 2021 · 38 citations

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021 · 420 citations

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
05 Feb 2021 · 143 citations

Bilevel Programming for Hyperparameter Optimization and Meta-Learning
Luca Franceschi, P. Frasconi, Saverio Salzo, Riccardo Grazzi, Massimiliano Pontil
13 Jun 2018 · 716 citations

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017 · 20,561 citations