ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

I2CKD : Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation

24 February 2025
Ayoub Karine, Thibault Napoléon, M. Jridi
Tags: VLM

Papers citing "I2CKD : Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation"

3 / 3 papers shown

1. Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
   Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu
   Tags: VLM, OffRL · 19 Jun 2023

2. What is the State of Neural Network Pruning?
   Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
   06 Mar 2020

3. ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation
   Adam Paszke, Abhishek Chaurasia, Sangpil Kim, Eugenio Culurciello
   Tags: SSeg · 07 Jun 2016