
Triplet Distillation for Deep Face Recognition
arXiv:1905.04457
IEEE International Conference on Image Processing (ICIP), 2019
11 May 2019
Yushu Feng, Huan Wang, Daniel T. Yi, Roland Hu

Papers citing "Triplet Distillation for Deep Face Recognition"

20 citing papers listed.
Honey, I Shrunk the Language Model: Impact of Knowledge Distillation Methods on Performance and Explainability
Daniel Hendriks, Philipp Spitzer, Niklas Kühl, G. Satzger
22 Apr 2025

AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition
Fadi Boutros, Vitomir Štruc, Naser Damer
01 Jul 2024

Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei
08 Mar 2024

Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning
IEEE Transactions on Knowledge and Data Engineering (TKDE), 2023
Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim
23 Nov 2023

Learning Using Generated Privileged Information by Text-to-Image Diffusion Models
International Conference on Pattern Recognition (ICPR), 2023
Rafael-Edy Menadil, Mariana-Iuliana Georgescu, Radu Tudor Ionescu
26 Sep 2023

Computation-efficient Deep Learning for Computer Vision: A Survey
Yulin Wang, Yizeng Han, Chaofei Wang, Shiji Song, Qi Tian, Gao Huang
27 Aug 2023

AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation
Amir M. Mansourian, Rozhan Ahmadi, S. Kasaei
08 Aug 2023

Grouped Knowledge Distillation for Deep Face Recognition
AAAI Conference on Artificial Intelligence (AAAI), 2023
Weisong Zhao, Xiangyu Zhu, Kaiwen Guo, Xiaoyu Zhang, Zhen Lei
10 Apr 2023

Expanding Knowledge Graphs with Humans in the Loop
Emaad A. Manzoor, Jordan Tong, S. Vijayaraghavan, Rui Li
10 Dec 2022

Lightning Fast Video Anomaly Detection via Adversarial Knowledge Distillation
Computer Vision and Image Understanding (CVIU), 2022
Florinel-Alin Croitoru, Nicolae-Cătălin Ristea, D. Dascalescu, Radu Tudor Ionescu, Fahad Shahbaz Khan, M. Shah
28 Nov 2022

Octuplet Loss: Make Face Recognition Robust to Image Resolution
IEEE International Conference on Automatic Face & Gesture Recognition (FG), 2022
Martin Knoche, Mohamed Elkadeem, S. Hörmann, Gerhard Rigoll
14 Jul 2022

Evaluation-oriented Knowledge Distillation for Deep Face Recognition
Computer Vision and Pattern Recognition (CVPR), 2022
Yanhua Huang, Jiaxiang Wu, Xingkun Xu, Shouhong Ding
06 Jun 2022

CoupleFace: Relation Matters for Face Recognition Distillation
European Conference on Computer Vision (ECCV), 2022
Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu
12 Apr 2022

Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion
Machine Vision and Applications (MVA), 2021
Mariana-Iuliana Georgescu, Georgian-Emilian Duta, Radu Tudor Ionescu
20 Nov 2021

Confidence Conditioned Knowledge Distillation
Sourav Mishra, Suresh Sundaram
06 Jul 2021

Self-restrained Triplet Loss for Accurate Masked Face Recognition
Pattern Recognition, 2021
Fadi Boutros, Naser Damer, Florian Kirchbuchner, Arjan Kuijper
02 Mar 2021

Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation
IEEE International Joint Conference on Neural Networks (IJCNN), 2021
Sourav Mishra, Suresh Sundaram
14 Feb 2021

Teacher-Student Training and Triplet Loss for Facial Expression Recognition under Occlusion
Mariana-Iuliana Georgescu, Radu Tudor Ionescu
03 Aug 2020

MarginDistillation: distillation for margin-based softmax
D. Svitov, S. Alyamkin
05 Mar 2020

Deep geometric knowledge distillation with graphs
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
Carlos Lassance, Myriam Bontonou, G. B. Hacene, Vincent Gripon, Jian Tang, Antonio Ortega
08 Nov 2019