MarginDistillation: distillation for margin-based softmax
5 March 2020
D. Svitov, S. Alyamkin
CVBM

Papers citing "MarginDistillation: distillation for margin-based softmax"

5 / 5 papers shown
AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
Fadi Boutros, Vitomir Štruc, Naser Damer
01 Jul 2024
Grouped Knowledge Distillation for Deep Face Recognition
AAAI Conference on Artificial Intelligence (AAAI), 2023
Weisong Zhao, Xiangyu Zhu, Kaiwen Guo, Xiaoyu Zhang, Zhen Lei
CVBM
10 Apr 2023
Evaluation-oriented Knowledge Distillation for Deep Face Recognition
Computer Vision and Pattern Recognition (CVPR), 2022
Yanhua Huang, Jiaxiang Wu, Xingkun Xu, Shouhong Ding
CVBM
06 Jun 2022
KDCTime: Knowledge Distillation with Calibration on InceptionTime for Time-series Classification
Information Sciences (Inf. Sci.), 2021
Xueyuan Gong, Yain-Whar Si, Yongqi Tian, Cong Lin, Xinyuan Zhang, Xiaoxiang Liu
04 Dec 2021
Prototype Memory for Large-scale Face Representation Learning
IEEE Access (IEEE Access), 2021
Evgeny Smirnov, Nikita Garaev, V. Galyuk, Evgeny Lukyanets
CVBM
05 May 2021