
Unified and Effective Ensemble Knowledge Distillation

1 April 2022
Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang
FedML
arXiv: 2204.00548 (abs / PDF / HTML)

Papers citing "Unified and Effective Ensemble Knowledge Distillation"

4 / 4 papers shown

MIND: Modality-Informed Knowledge Distillation Framework for Multimodal Clinical Prediction Tasks
Alejandro Guerra-Manzanares, Farah E. Shamout
03 Feb 2025

GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024
Wenjie Zhou, Zhenxin Ding, Xiaodong Zhang, Haibo Shi, Junfeng Wang, D. Yin
06 May 2024

Ensemble knowledge distillation of self-supervised speech models
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2023
Kuan-Po Huang, Tzu-hsun Feng, Yu-Kuan Fu, Tsung-Yuan Hsu, Po-Chieh Yen, Wei-Cheng Tseng, Kai-Wei Chang, Hung-yi Lee
24 Feb 2023

PROD: Progressive Distillation for Dense Retrieval
The Web Conference (WWW), 2022
Zhenghao Lin, Yeyun Gong, Xiao Liu, Hang Zhang, Chen Lin, ..., Jian Jiao, Jing Lu, Daxin Jiang, Rangan Majumder, Nan Duan
27 Sep 2022