I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation

24 February 2025
Ayoub Karine
Thibault Napoléon
Maher Jridi
Abstract

This paper proposes a new knowledge distillation method tailored for image semantic segmentation, termed Intra- and Inter-Class Knowledge Distillation (I2CKD). The method focuses on capturing and transferring knowledge between the intermediate layers of the teacher (cumbersome model) and the student (compact model). For knowledge extraction, we exploit class prototypes derived from feature maps. To facilitate knowledge transfer, we employ a triplet loss that minimizes intra-class variance and maximizes inter-class variance between teacher and student prototypes. Consequently, I2CKD enables the student to better mimic the teacher's feature representation for each class, thereby enhancing the segmentation performance of the compact network. Extensive experiments on three segmentation datasets, i.e., Cityscapes, Pascal VOC, and CamVid, using various teacher-student network pairs demonstrate the effectiveness of the proposed method.
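To make the prototype-and-triplet idea concrete, below is a minimal PyTorch sketch (not the authors' code, which the abstract does not specify): class prototypes are computed by masked average pooling of intermediate feature maps over each class's ground-truth region, and a standard triplet margin loss pulls each student prototype toward the same-class teacher prototype while pushing it away from other-class teacher prototypes. The function names, the nearest-neighbor label downsampling, the margin value, and the assumption that student and teacher feature channels match (e.g., after a 1x1 projection) are all illustrative assumptions.

import torch
import torch.nn.functional as F

def class_prototypes(feats, labels, num_classes):
    # feats:  (B, C, H, W) intermediate feature maps
    # labels: (B, h, w) ground-truth class indices
    # Returns (num_classes, C); classes absent from the batch stay zero.
    B, C, H, W = feats.shape
    labels = F.interpolate(labels[:, None].float(), size=(H, W),
                           mode="nearest").long().squeeze(1)
    protos = feats.new_zeros(num_classes, C)
    for c in range(num_classes):
        mask = (labels == c).unsqueeze(1).float()  # (B, 1, H, W)
        area = mask.sum()
        if area > 0:
            # Masked average pooling: mean feature over class-c pixels.
            protos[c] = (feats * mask).sum(dim=(0, 2, 3)) / area
    return protos

def i2ckd_triplet_loss(student_feats, teacher_feats, labels,
                       num_classes, margin=1.0):
    # Anchor: student prototype of class c.
    # Positive: teacher prototype of the same class (intra-class term).
    # Negative: teacher prototype of a different class (inter-class term).
    p_s = class_prototypes(student_feats, labels, num_classes)
    p_t = class_prototypes(teacher_feats, labels, num_classes).detach()
    present = [c for c in range(num_classes) if (labels == c).any()]
    loss = student_feats.new_zeros(())
    pairs = 0
    for c in present:
        for k in present:
            if k == c:
                continue
            d_pos = (p_s[c] - p_t[c]).norm()
            d_neg = (p_s[c] - p_t[k]).norm()
            loss = loss + F.relu(d_pos - d_neg + margin)
            pairs += 1
    return loss / max(pairs, 1)

In training, a term like this would presumably be added to the usual cross-entropy segmentation loss with a weighting coefficient; the margin and the weight are hyperparameters not given in the abstract.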

@article{karine2025_2403.18490,
  title={I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation},
  author={Ayoub Karine and Thibault Napoléon and Maher Jridi},
  journal={arXiv preprint arXiv:2403.18490},
  year={2025}
}