Selective Cross-Task Distillation
Su Lu, Han-Jia Ye, De-Chuan Zhan
arXiv 2204.11526 · 25 April 2022

Papers citing "Selective Cross-Task Distillation" (10 of 10 shown)

Cross-Task Knowledge Distillation in Multi-Task Recommendation
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
20 Feb 2022 · 44 citations

Improving Neural Cross-Lingual Summarization via Employing Optimal Transport Distance for Knowledge Distillation
Thong Nguyen, A. Luu
07 Dec 2021 · 39 citations

Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs
Kaichao You, Yong Liu, Ziyang Zhang, Jianmin Wang, Michael I. Jordan, Mingsheng Long
20 Oct 2021 · 30 citations

MLP-Mixer: An all-MLP Architecture for Vision
Ilya O. Tolstikhin, N. Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, ..., Andreas Steiner, Daniel Keysers, Jakob Uszkoreit, Mario Lucic, Alexey Dosovitskiy
04 May 2021 · 2,554 citations

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021 · 99 citations

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
05 Feb 2021 · 140 citations

SEED: Self-supervised Distillation For Visual Representation [SSL]
Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
12 Jan 2021 · 186 citations

Which Model to Transfer? Finding the Needle in the Growing Haystack
Cédric Renggli, André Susano Pinto, Luka Rimanic, J. Puigcerver, C. Riquelme, Ce Zhang, Mario Lucic
13 Oct 2020 · 23 citations

Transferability and Hardness of Supervised Classification Tasks
Anh Tran, Cuong V Nguyen, Tal Hassner
21 Aug 2019 · 163 citations

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications [3DH]
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017 · 20,214 citations