Rocket Launching: A Universal and Efficient Framework for Training Well-performing Light Net
Guorui Zhou, Ying Fan, Runpeng Cui, Weijie Bian, Xiaoqiang Zhu, Kun Gai
arXiv:1708.04106, 14 August 2017

Papers citing "Rocket Launching: A Universal and Efficient Framework for Training Well-performing Light Net" (13 of 13 papers shown)

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023

Knowledge Distillation from Single to Multi Labels: an Empirical Study [VLM]
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu
15 Mar 2023

Audio Representation Learning by Distilling Video as Privileged Information
Amirhossein Hajavi, Ali Etemad
06 Feb 2023

Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Sanjiv Kumar
28 Jan 2023

Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation
Zhen Tian, Ting Bai, Ziyan Zhang, Zhiyuan Xu, Kangyi Lin, Ji-Rong Wen, Wayne Xin Zhao
21 Nov 2022

Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation
Rahul Mishra, Hari Prabhat Gupta
30 Sep 2022

Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu
16 Jun 2022

Knowledge Distillation Meets Open-Set Semi-Supervised Learning
Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos
13 May 2022

2D Human Pose Estimation: A Survey [3DH]
Haoming Chen, Runyang Feng, Sifan Wu, Hao Xu, F. Zhou, Zhenguang Liu
15 Apr 2022

Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation [ViT]
Zhiwei Hao, Jianyuan Guo, Ding Jia, Kai Han, Yehui Tang, Chao Zhang, Dacheng Tao, Yunhe Wang
03 Jul 2021

Privileged Graph Distillation for Cold Start Recommendation
Shuai Wang, Kun Zhang, Le Wu, Haiping Ma, Richang Hong, Meng Wang
31 May 2021

Knowledge Distillation: A Survey [VLM]
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020

Deep Interest Evolution Network for Click-Through Rate Prediction
Guorui Zhou, Na Mou, Ying Fan, Qi Pi, Weijie Bian, Chang Zhou, Xiaoqiang Zhu, Kun Gai
11 Sep 2018