arXiv:2408.16563
MST-KD: Multiple Specialized Teachers Knowledge Distillation for Fair Face Recognition

29 August 2024
Eduarda Caldeira
Jaime S. Cardoso
Ana F. Sequeira
Pedro C. Neto
    CVBM
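As a rough illustration of the general idea behind multi-teacher knowledge distillation (a sketch only, not the paper's actual method; the fusion-by-averaging step, function names, and temperature value are assumptions):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_distillation_loss(student_logits, teacher_logits_list, T=4.0):
    """KL(fused teachers || student) at temperature T.

    Fusing by averaging the teachers' softened probabilities is one simple
    choice; published multi-teacher methods use more elaborate fusion schemes.
    """
    p_teachers = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)
    p_student = softmax(student_logits, T)
    # KL divergence, scaled by T^2 as is conventional in distillation losses.
    return float(T**2 * np.sum(p_teachers * (np.log(p_teachers + 1e-12)
                                             - np.log(p_student + 1e-12))))

# Example: two specialized teachers distilled into one student, 3 classes.
s = np.array([2.0, 0.5, -1.0])
t1 = np.array([3.0, 0.0, -2.0])
t2 = np.array([2.5, 1.0, -1.5])
loss = multi_teacher_distillation_loss(s, [t1, t2])
```

The loss is zero when the student's softened distribution matches the fused teacher distribution, and positive otherwise, so gradient descent on it pulls the student toward the teachers' consensus.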

Papers citing "MST-KD: Multiple Specialized Teachers Knowledge Distillation for Fair Face Recognition"

2 papers shown

1. Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
   Cuong Pham, Tuan Hoang, Thanh-Toan Do
   FedML, MQ
   27 Oct 2022

2. Neural Architecture Search with Reinforcement Learning
   Barret Zoph, Quoc V. Le
   05 Nov 2016