On the Efficiency of Subclass Knowledge Distillation in Classification Tasks

12 September 2021
A. Sajedi, Konstantinos N. Plataniotis
arXiv: 2109.05587

Papers citing "On the Efficiency of Subclass Knowledge Distillation in Classification Tasks"

4 citing papers

How to Train the Teacher Model for Effective Knowledge Distillation
Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan, Linfeng Ye, Ahmed H. Salamah
25 Jul 2024

ATOM: Attention Mixer for Efficient Dataset Distillation
Samir Khaki, A. Sajedi, Kai Wang, Lucy Z. Liu, Y. Lawryshyn, Konstantinos N. Plataniotis
02 May 2024

Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information
International Conference on Learning Representations (ICLR), 2024
Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang
16 Jan 2024

ProbMCL: Simple Probabilistic Contrastive Learning for Multi-label Visual Classification
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2024
A. Sajedi, Samir Khaki, Y. Lawryshyn, Konstantinos N. Plataniotis
02 Jan 2024