
Training convolutional neural networks with cheap convolutions and online distillation

28 September 2019 · arXiv:1909.13063
Jiao Xie, Shaohui Lin, Yichen Zhang, Linkai Luo
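The listing itself gives only metadata, but for orientation: online distillation replaces a fixed, pre-trained teacher with a teacher signal produced during training itself, for example the averaged predictions of peer networks trained in parallel. The sketch below is a hypothetical PyTorch illustration of that general idea (mutual distillation between peers); the two-peer setup, the temperature T, and the loss weight alpha are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of online (mutual) distillation; the peers, T, and alpha
# are illustrative assumptions, not details taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def online_distillation_step(peers, optimizer, x, y, T=3.0, alpha=0.5):
    """One joint step: each peer fits the labels and also matches the
    softened average of the other peers' predictions, so the teacher
    is built online instead of being pre-trained."""
    logits = [net(x) for net in peers]
    total = 0.0
    for i in range(len(peers)):
        ce = F.cross_entropy(logits[i], y)  # hard-label loss
        # Online teacher: mean of the other peers' logits, detached so
        # no gradient flows back into the teachers through this term.
        teacher = torch.stack(
            [logits[j].detach() for j in range(len(peers)) if j != i]
        ).mean(dim=0)
        kd = F.kl_div(
            F.log_softmax(logits[i] / T, dim=1),
            F.softmax(teacher / T, dim=1),
            reduction="batchmean",
        ) * (T * T)  # T^2 rescaling, as in standard distillation
        total = total + (1 - alpha) * ce + alpha * kd
    optimizer.zero_grad()
    total.backward()
    optimizer.step()
    return total.item()

# Toy usage: two small classifiers distilling into each other on random data.
peers = [nn.Sequential(nn.Flatten(), nn.Linear(32, 10)) for _ in range(2)]
optimizer = torch.optim.SGD(
    [p for net in peers for p in net.parameters()], lr=0.1
)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss = online_distillation_step(peers, optimizer, x, y)
```

Detaching the teacher logits means each peer is only pulled toward the others, never distorted to accommodate them. The paper combines this style of online distillation with cheap convolution operators to reduce training and inference cost; this sketch does not attempt to reproduce that part.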

Papers citing "Training convolutional neural networks with cheap convolutions and online distillation"

4 / 4 papers shown
Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu
16 Jun 2022

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 09 Jun 2020

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
FedML · 09 Apr 2018

Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016