ResearchTrend.AI

Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search

27 June 2022
Taehyeon Kim
Heesoo Myeong
Se-Young Yun

Papers citing "Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search"

4 / 4 papers shown

1. MLP-Mixer: An all-MLP Architecture for Vision
   Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
   04 May 2021

2. Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces
   Bert Moons, Parham Noorzad, Andrii Skliar, G. Mariani, Dushyant Mehta, Chris Lott, Tijmen Blankevoort
   16 Dec 2020

3. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
   Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
   17 Apr 2017

4. Neural Architecture Search with Reinforcement Learning
   Barret Zoph, Quoc V. Le
   05 Nov 2016