Cited By

Customizing Student Networks From Heterogeneous Teachers via Adaptive Knowledge Amalgamation
arXiv 1908.07121 · 20 August 2019
Chengchao Shen, Mengqi Xue, Xinchao Wang, Mingli Song, Li Sun, Xiuming Zhang
MoMe
ArXiv (abs) · PDF · HTML
Papers citing "Customizing Student Networks From Heterogeneous Teachers via Adaptive Knowledge Amalgamation" (14 of 14 papers shown)

Swiss Army Knife: Synergizing Biases in Knowledge from Vision Foundation Models for Multi-Task Learning
Yuxiang Lu, Shengcao Cao, Yu-Xiong Wang
124 · 1 · 0 · 18 Oct 2024

Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures
Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao
105 · 1 · 0 · 28 May 2024

Generative Model-based Feature Knowledge Distillation for Action Recognition
Guiqin Wang, Peng Zhao, Yanjiang Shi, Cong Zhao, Shusen Yang
VLM
72 · 3 · 0 · 14 Dec 2023

Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model
Karsten Roth, Lukas Thede, Almut Sophia Koepke, Oriol Vinyals, Olivier J. Hénaff, Zeynep Akata
AAML
119 · 13 · 0 · 26 Oct 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
VLM, OffRL
191 · 26 · 0 · 19 Jun 2023

Label driven Knowledge Distillation for Federated Learning with non-IID Data
Minh-Duong Nguyen, Quoc-Viet Pham, D. Hoang, Long Tran-Thanh, Diep N. Nguyen, Won Joo Hwang
78 · 2 · 0 · 29 Sep 2022

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data
Gongfan Fang, Yifan Bao, Mingli Song, Xinchao Wang, Don Xie, Chengchao Shen, Xiuming Zhang
97 · 44 · 0 · 27 Oct 2021

Meta-Aggregator: Learning to Aggregate for 1-bit Graph Neural Networks
Yongcheng Jing, Yiding Yang, Xinchao Wang, Xiuming Zhang, Dacheng Tao
114 · 42 · 0 · 27 Sep 2021

Training Generative Adversarial Networks in One Stage
Chengchao Shen, Youtan Yin, Xinchao Wang, Xubin Li, Mingli Song, Xiuming Zhang
GAN
111 · 13 · 0 · 28 Feb 2021

Collaborative Teacher-Student Learning via Multiple Knowledge Transfer
Liyuan Sun, Jianping Gou, Baosheng Yu, Lan Du, Dacheng Tao
71 · 12 · 0 · 21 Jan 2021

Progressive Network Grafting for Few-Shot Knowledge Distillation
Chengchao Shen, Xinchao Wang, Youtan Yin, Mingli Song, Sihui Luo, Xiuming Zhang
75 · 49 · 0 · 09 Dec 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM
291 · 3,026 · 0 · 09 Jun 2020

Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Xiuming Zhang, Dacheng Tao, Xinchao Wang
239 · 233 · 0 · 23 Mar 2020

Data-Free Adversarial Distillation
Gongfan Fang, Mingli Song, Chengchao Shen, Xinchao Wang, Da Chen, Xiuming Zhang
82 · 148 · 0 · 23 Dec 2019