Distill-to-Label: Weakly Supervised Instance Labeling Using Knowledge Distillation

International Conference on Machine Learning and Applications (ICMLA), 2019
26 July 2019
Jayaraman J. Thiagarajan, Satyananda Kashyap, Alexandros Karargyris
VLM
arXiv:1907.12926 (abs) · PDF · HTML

Papers citing "Distill-to-Label: Weakly Supervised Instance Labeling Using Knowledge Distillation"

3 / 3 papers shown
A Survey on Cell Nuclei Instance Segmentation and Classification: Leveraging Context and Attention
João D. Nunes, D. Montezuma, Domingos Oliveira, Tania Pereira, Jaime S. Cardoso
274 · 0 · 0
26 Jul 2024
Visualizing the embedding space to explain the effect of knowledge distillation
Asian Conference on Pattern Recognition (ACPR), 2021
Hyun Seung Lee, C. Wallraven
151 · 1 · 0
09 Oct 2021
Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification
Xiaohan Xing, Yuenan Hou, Han Li, Yixuan Yuan, Jiaming Song, Max Meng
VLM
135 · 46 · 0
07 Jul 2021