HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification
v2 (latest)

10 July 2024
Omar S. El-Assiouti
Ghada Hamed
Dina Khattab
H. M. Ebied

Papers citing "HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification"

2 / 2 papers shown
Dynamic Weight Adjustment for Knowledge Distillation: Leveraging Vision Transformer for High-Accuracy Lung Cancer Detection and Real-Time Deployment
Saif Ur Rehman Khan
Muhammad Nabeel Asim
Sebastian Vollmer
Andreas Dengel
23 Oct 2025

WeCKD: Weakly-supervised Chained Distillation Network for Efficient Multimodal Medical Imaging
M. R
Sami Azam
Asif Karim
Jemima Beissbarth
Amanda Leach
MedIm
16 Oct 2025