ResearchTrend.AI
Feature Fusion for Online Mutual Knowledge Distillation
Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak
arXiv:1904.09058, 19 April 2019. [FedML]
Papers citing "Feature Fusion for Online Mutual Knowledge Distillation"

16 / 16 papers shown
Indirect Gradient Matching for Adversarial Robust Distillation
  Hongsin Lee, Seungju Cho, Changick Kim. 06 Dec 2023. [AAML, FedML]
Multi-View Fusion and Distillation for Subgrade Distresses Detection based on 3D-GPR
  Chunpeng Zhou, Kang Ning, Haishuai Wang, Zhi Yu, Sheng Zhou, Jiajun Bu. 09 Aug 2023.
Teacher-Student Architecture for Knowledge Distillation: A Survey
  Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu. 08 Aug 2023.
Generalization Matters: Loss Minima Flattening via Parameter Hybridization for Efficient Online Knowledge Distillation
  Tianli Zhang, Mengqi Xue, Jiangtao Zhang, Haofei Zhang, Yu Wang, Lechao Cheng, Jie Song, Mingli Song. 26 Mar 2023.
Teacher-Student Architecture for Knowledge Learning: A Survey
  Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu. 28 Oct 2022.
QTI Submission to DCASE 2021: Residual Normalization for Device-Imbalanced Acoustic Scene Classification with Efficient Design
  Byeonggeun Kim, Seunghan Yang, Jangho Kim, Simyung Chang. 28 Jun 2022.
Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
  Panpan Zou, Yinglei Teng, Tao Niu. 16 Jun 2022.
HFT: Lifting Perspective Representations via Hybrid Feature Transformation
  Jiayu Zou, Jun Xiao, Zheng Hua Zhu, Junjie Huang, Guan Huang, Dalong Du, Xingang Wang. 11 Apr 2022.
Efficient Training of Lightweight Neural Networks Using Online Self-Acquired Knowledge Distillation
  Maria Tzelepi, Anastasios Tefas. 26 Aug 2021.
PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation
  Jang-Hyun Kim, Simyung Chang, Nojun Kwak. 25 Jun 2021.
Distilling a Powerful Student Model via Online Knowledge Distillation
  Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji. 26 Mar 2021. [FedML]
Prototype-based Personalized Pruning
  Jang-Hyun Kim, Simyung Chang, Sungrack Yun, Nojun Kwak. 25 Mar 2021.
Knowledge Distillation: A Survey
  Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. 09 Jun 2020. [VLM]
QKD: Quantization-aware Knowledge Distillation
  Jangho Kim, Yash Bhalgat, Jinwon Lee, Chirag I. Patel, Nojun Kwak. 28 Nov 2019. [MQ]
Knowledge Distillation by On-the-Fly Native Ensemble
  Xu Lan, Xiatian Zhu, S. Gong. 12 Jun 2018.
Large Scale Distributed Neural Network Training Through Online Distillation
  Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton. 09 Apr 2018. [FedML]