Augmentation-Free Dense Contrastive Knowledge Distillation for Efficient Semantic Segmentation

7 December 2023
Jiawei Fan
Chao Li
Xiaolong Liu
Meina Song
Anbang Yao
ArXiv (abs) · PDF · HTML · GitHub (15★)

Papers citing "Augmentation-Free Dense Contrastive Knowledge Distillation for Efficient Semantic Segmentation"

4 / 4 papers shown
Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation
AAAI Conference on Artificial Intelligence (AAAI), 2025
Yuanmin Huang
Kai Hu
Yuhui Zhang
Z. Chen
Xieping Gao
10 Apr 2025
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving
Sicen Guo
Zhiyuan Wu
Qijun Chen
Ioannis Pitas
Rui Fan
13 Mar 2024
Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction
Knowledge-Based Systems (KBS), 2024
Zhaoge Liu
Xiaohao Xu
Yunkang Cao
Nong Sang
16 Jan 2024
Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
Yonglong Tian
Dilip Krishnan
Phillip Isola
23 Oct 2019