1909.13063
Training convolutional neural networks with cheap convolutions and online distillation
28 September 2019
Jiao Xie, Shaohui Lin, Yichen Zhang, Linkai Luo
Links: ArXiv | PDF | HTML
Papers citing "Training convolutional neural networks with cheap convolutions and online distillation" (4 of 4 papers shown)
Title | Authors | Tags | Metrics | Date
Multi scale Feature Extraction and Fusion for Online Knowledge Distillation | Panpan Zou, Yinglei Teng, Tao Niu | - | 24 / 3 / 0 | 16 Jun 2022
Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 19 / 2,835 / 0 | 09 Jun 2020
Large scale distributed neural network training through online distillation | Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton | FedML | 275 / 404 / 0 | 09 Apr 2018
Neural Architecture Search with Reinforcement Learning | Barret Zoph, Quoc V. Le | - | 264 / 5,326 / 0 | 05 Nov 2016