Student-friendly Knowledge Distillation
Mengyang Yuan, Bo Lang, Fengnan Quan
arXiv:2305.10893, 18 May 2023
Papers citing "Student-friendly Knowledge Distillation" (7 of 7 shown):
Attention-guided Feature Distillation for Semantic Segmentation. Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei. 08 Mar 2024.
AST: Effective Dataset Distillation through Alignment with Smooth and High-Quality Expert Trajectories. Jiyuan Shen, Wenzhuo Yang, Kwok-Yan Lam. 16 Oct 2023.
Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing? Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-man Cheung. 29 Jun 2022.
Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021.
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching. Mingi Ji, Byeongho Heo, Sungrae Park. 05 Feb 2021.
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam. 17 Apr 2017.
ImageNet Large Scale Visual Recognition Challenge. Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei. 01 Sep 2014.