Knowledge Distillation for Semantic Segmentation: A Label Space Unification Approach

26 February 2025
Anton Backhaus
Thorsten Luettel
Mirko Maehlisch
Abstract

An increasing number of datasets sharing similar domains for semantic segmentation have been published over the past few years. However, despite the growing amount of data overall, it is still difficult to train bigger and better models due to inconsistencies in the taxonomies and/or labeling policies of different datasets. To this end, we propose a knowledge distillation approach that also serves as a label space unification method for semantic segmentation. In short, a teacher model is trained on a source dataset with a given taxonomy, then used to pseudo-label additional data for which ground-truth labels in a related label space exist. By mapping the related taxonomies to the source taxonomy, we create constraints within which the model can predict pseudo-labels. Using the improved pseudo-labels, we train student models that consistently outperform their teachers in two challenging domains, namely urban and off-road driving. Our ground-truth-corrected pseudo-labels span 12 and 7 public datasets with 388,230 and 18,558 images for the urban and off-road domains, respectively, creating the largest compound datasets for autonomous driving to date.
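The core mechanism is constraining the teacher's per-pixel prediction to the set of fine-grained source classes that are compatible with the pixel's coarse ground-truth label in the related taxonomy. Below is a minimal sketch of that step, assuming a hypothetical taxonomy mapping; the class IDs and the constrained_pseudo_labels function are illustrative assumptions, not the authors' code.

```python
import torch

# Hypothetical mapping from a related dataset's label IDs to the source-taxonomy
# class IDs they may correspond to (IDs and semantics are illustrative only).
RELATED_TO_SOURCE = {
    0: [0],        # e.g. "road" maps 1:1
    1: [1, 2],     # e.g. "vegetation" may be "tree" or "bush" in the source taxonomy
    2: [3, 4, 5],  # e.g. "vehicle" may be "car", "truck", or "bus"
}

def constrained_pseudo_labels(teacher_logits: torch.Tensor,
                              related_gt: torch.Tensor) -> torch.Tensor:
    """Restrict the teacher's prediction at each pixel to the source-taxonomy
    classes permitted by that pixel's coarse ground-truth label.

    teacher_logits: (C, H, W) float tensor of source-taxonomy logits.
    related_gt:     (H, W) long tensor of labels in the related taxonomy.
    Returns:        (H, W) pseudo-labels in the source taxonomy.
    """
    C, _, _ = teacher_logits.shape
    # Build a (num_related_classes, C) boolean table of allowed source classes.
    allowed = torch.zeros(len(RELATED_TO_SOURCE), C, dtype=torch.bool)
    for rel_id, src_ids in RELATED_TO_SOURCE.items():
        allowed[rel_id, src_ids] = True
    # Look up the per-pixel mask and suppress disallowed classes before argmax.
    pixel_mask = allowed[related_gt]                       # (H, W, C)
    masked = teacher_logits.permute(1, 2, 0)               # (H, W, C)
    masked = masked.masked_fill(~pixel_mask, float("-inf"))
    return masked.argmax(dim=-1)                           # (H, W)

# Usage sketch: pseudo = constrained_pseudo_labels(teacher(image), coarse_label_map)
```

Masking the logits rather than post-processing the argmax guarantees the pseudo-label never contradicts the coarse ground truth, which is what allows the student to learn a unified fine-grained label space from datasets with differing taxonomies.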

@article{backhaus2025_2502.19177,
  title={Knowledge Distillation for Semantic Segmentation: A Label Space Unification Approach},
  author={Anton Backhaus and Thorsten Luettel and Mirko Maehlisch},
  journal={arXiv preprint arXiv:2502.19177},
  year={2025}
}