Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning

IEEE International Conference on Multimedia and Expo (ICME), 2023
11 June 2023
Hailin Zhang, Defang Chen, Can Wang
arXiv:2306.06634 (abs) · PDF · HTML · GitHub (25★)
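
For orientation, the paper's title names adaptive multi-teacher knowledge distillation, where a student is trained against several teachers whose mixing weights are adapted via meta-learning. The sketch below is a minimal, generic illustration of weighted multi-teacher logit distillation, not the authors' method: the function name, the temperature T, and the learnable teacher_weight_logits parameter are assumptions for illustration, and the paper's meta-learning update for the weights is not reproduced here.

```python
# Minimal sketch of multi-teacher knowledge distillation with learnable
# per-teacher weights. Generic illustration only, NOT the paper's exact
# method; all names here are hypothetical.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list,
                          teacher_weight_logits, T=4.0):
    """KL distillation loss against a weighted mixture of teacher distributions.

    student_logits:        (batch, classes) student outputs
    teacher_logits_list:   list of (batch, classes) tensors, one per teacher
    teacher_weight_logits: (num_teachers,) learnable parameter; its softmax
                           gives the per-teacher mixing weights (adapted by
                           meta-learning in the actual paper)
    """
    weights = torch.softmax(teacher_weight_logits, dim=0)            # (K,)
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=-1) for t in teacher_logits_list])     # (K, B, C)
    mixed = (weights[:, None, None] * teacher_probs).sum(dim=0)      # (B, C)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    # Standard KD scaling by T^2 so gradients keep their magnitude.
    return F.kl_div(log_p_student, mixed, reduction="batchmean") * (T * T)

# Hypothetical usage: 3 teachers, batch of 8, 10 classes.
# w = torch.nn.Parameter(torch.zeros(3))
# loss = multi_teacher_kd_loss(s_logits, [t1, t2, t3], w)
```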

Papers citing "Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning"

10 / 10 papers shown
Knowledge Distillation of Uncertainty using Deep Latent Factor Model
Sehyun Park, Jongjin Lee, Yunseop Shin, Ilsang Ohn, Yongdai Kim
UQCV, BDL · 329 · 0 · 0 · 22 Oct 2025

Learning Task-Agnostic Representations through Multi-Teacher Distillation
Philippe Formont, Maxime Darrin, Banafsheh Karimian, Jackie Chi Kit Cheung, Eric Granger, Ismail Ben Ayed, Mohammadhadi Shateri, Pablo Piantanida
132 · 0 · 0 · 21 Oct 2025

AMMKD: Adaptive Multimodal Multi-teacher Distillation for Lightweight Vision-Language Models
Yuqi Li, Chuanguang Yang, Junhao Dong, Zhengtao Yao, Haoyan Xu, Zeyu Dong, Hansheng Zeng, Zhulin An, Yingli Tian
VLM · 85 · 5 · 0 · 23 Aug 2025

Distilled-3DGS: Distilled 3D Gaussian Splatting
Lintao Xiang, Xinkai Chen, Jianhuang Lai, Guangcong Wang
3DGS · 101 · 0 · 0 · 19 Aug 2025

DeepKD: A Deeply Decoupled and Denoised Knowledge Distillation Trainer
Haiduo Huang, Jiangcheng Song, Yadong Zhang, Pengju Ren
271 · 0 · 0 · 21 May 2025

DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images
Sadman Sakib Alif, Nasim Anzum Promise, Fiaz Al Abid, Aniqua Nusrat Zereen
209 · 1 · 0 · 14 May 2025

Image Recognition with Online Lightweight Vision Transformer: A Survey
Zherui Zhang, Rongtao Xu, Jie Zhou, Changwei Wang, Xingtian Pei, ..., Jiguang Zhang, Li Guo, Longxiang Gao, Wenyuan Xu, Shibiao Xu
ViT · 1.1K · 2 · 0 · 06 May 2025

Multi-Teacher Knowledge Distillation with Reinforcement Learning for Visual Recognition
AAAI Conference on Artificial Intelligence (AAAI), 2025
Chuanguang Yang, Xinqiang Yu, Han Yang, Zhulin An, Chengqing Yu, Libo Huang, Yongjun Xu
270 · 7 · 0 · 22 Feb 2025

TransAgent: Transfer Vision-Language Foundation Models with Heterogeneous Agent Collaboration
Neural Information Processing Systems (NeurIPS), 2024
Yiwei Guo, Shaobin Zhuang, Kunchang Li, Yu Qiao, Yali Wang
VLM, CLIP · 349 · 5 · 0 · 16 Oct 2024

Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
1.1K · 1,197 · 0 · 23 Oct 2019