arXiv: 1909.08097

Ensemble Knowledge Distillation for Learning Improved and Efficient Networks
Umar Asif, Jianbin Tang, S. Harrer
17 September 2019
Community: FedML
Papers citing "Ensemble Knowledge Distillation for Learning Improved and Efficient Networks" (11 papers)

| Title | Authors | Tags | Citations | Date |
| --- | --- | --- | --- | --- |
| Corrected with the Latest Version: Make Robust Asynchronous Federated Learning Possible | Chaoyi Lu, Yiding Sun, Pengbo Li, Zhichuan Yang | FedML | 0 | 05 Apr 2025 |
| PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation | Mike Ranzinger, Jon Barker, Greg Heinrich, Pavlo Molchanov, Bryan Catanzaro, Andrew Tao | | 4 | 02 Oct 2024 |
| AM-RADIO: Agglomerative Vision Foundation Model -- Reduce All Domains Into One | Michael Ranzinger, Greg Heinrich, Jan Kautz, Pavlo Molchanov | VLM | 42 | 10 Dec 2023 |
| FSNet: Redesign Self-Supervised MonoDepth for Full-Scale Depth Prediction for Autonomous Driving | Yuxuan Liu, Zhenhua Xu, Huaiyang Huang, Lujia Wang, Ming-Yu Liu | MDE | 3 | 21 Apr 2023 |
| Knowledge Distillation for Efficient Sequences of Training Runs | Xingyu Liu, A. Leonardi, Lu Yu, Chris Gilmer-Hill, Matthew L. Leavitt, Jonathan Frankle | | 4 | 11 Mar 2023 |
| End-to-end Ensemble-based Feature Selection for Paralinguistics Tasks | Tamás Grósz, Mittul Singh, Sudarsana Reddy Kadiri, H. Kathania, M. Kurimo | | 0 | 28 Oct 2022 |
| Federated Learning with Privacy-Preserving Ensemble Attention Distillation | Xuan Gong, Liangchen Song, Rishi Vedula, Abhishek Sharma, Meng Zheng, ..., Arun Innanje, Terrence Chen, Junsong Yuan, David Doermann, Ziyan Wu | FedML | 27 | 16 Oct 2022 |
| Label driven Knowledge Distillation for Federated Learning with non-IID Data | Minh-Duong Nguyen, Viet Quoc Pham, D. Hoang, Long Tran-Thanh, Diep N. Nguyen, W. Hwang | | 2 | 29 Sep 2022 |
| Enhancing Heterogeneous Federated Learning with Knowledge Extraction and Multi-Model Fusion | Duy Phuong Nguyen, Sixing Yu, J. P. Muñoz, Ali Jannesari | FedML | 12 | 16 Aug 2022 |
| A Comprehensive Survey on Hardware-Aware Neural Architecture Search | Hadjer Benmeziane, K. E. Maghraoui, Hamza Ouarnoughi, Smail Niar, Martin Wistuba, Naigang Wang | | 95 | 22 Jan 2021 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 2,835 | 09 Jun 2020 |