FEED: Feature-level Ensemble for Knowledge Distillation (arXiv:1909.10754)
24 September 2019
Seonguk Park, Nojun Kwak
Tags: FedML

Papers citing "FEED: Feature-level Ensemble for Knowledge Distillation" (10 of 10 papers shown)
1. Self-discipline on multiple channels
   Jiutian Zhao, Liangchen Luo, Hao Wang
   27 Apr 2023

2. Federated Learning with Privacy-Preserving Ensemble Attention Distillation
   Xuan Gong, Liangchen Song, Rishi Vedula, Abhishek Sharma, Meng Zheng, ..., Arun Innanje, Terrence Chen, Junsong Yuan, David Doermann, Ziyan Wu
   Tags: FedML
   16 Oct 2022

3. Integrating Object-aware and Interaction-aware Knowledge for Weakly Supervised Scene Graph Generation
   Xingchen Li, Long Chen, Wenbo Ma, Yi Yang, Jun Xiao
   03 Aug 2022

4. EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation
   Lin Wang, Yujeong Chae, Sung-Hoon Yoon, Tae-Kyun Kim, Kuk-Jin Yoon
   24 Nov 2021

5. KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks
   Rishabh Bhardwaj, Tushar Vaidya, Soujanya Poria
   Tags: OT, FedML
   06 Oct 2021

6. Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation
   Lin Wang, Yujeong Chae, Kuk-Jin Yoon
   04 Sep 2021

7. There is More than Meets the Eye: Self-Supervised Multi-Object Detection and Tracking with Sound by Distilling Multimodal Knowledge
   Francisco Rivera Valverde, Juana Valeria Hurtado, Abhinav Valada
   01 Mar 2021

8. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
   Tags: VLM
   09 Jun 2020

9. Knowledge Distillation by On-the-Fly Native Ensemble
   Xu Lan, Xiatian Zhu, S. Gong
   12 Jun 2018

10. Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
    Antti Tarvainen, Harri Valpola
    Tags: OOD, MoMe
    06 Mar 2017