Cross Architecture Distillation for Face Recognition

ACM Multimedia (ACM MM), 2023
26 June 2023
Weisong Zhao
Xiangyu Zhu
Zhixiang He
Xiaoyu Zhang
Zhen Lei
CVBM
arXiv: 2306.14662 (abs / PDF / HTML)

Papers citing "Cross Architecture Distillation for Face Recognition"

6 / 6 papers shown
UHKD: A Unified Framework for Heterogeneous Knowledge Distillation via Frequency-Domain Representations
Fengming Yu
Haiwei Pan
Kejia Zhang
Jian Guan
Haiying Jiang
28 Oct 2025
Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation
AAAI Conference on Artificial Intelligence (AAAI), 2025
Yuanmin Huang
Kai Hu
Yuhui Zhang
Z. Chen
Xieping Gao
10 Apr 2025
Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures
Yaoxin Yang
Peng Ye
Weihao Lin
Kangcong Li
Yan Wen
Jia Hao
Tao Chen
10 Feb 2025
Applications of Knowledge Distillation in Remote Sensing: A Survey
Information Fusion (Inf. Fusion), 2024
Yassine Himeur
N. Aburaed
O. Elharrouss
Iraklis Varlamis
Shadi Atalla
Hussain Al Ahmad
18 Sep 2024
Cross-Architecture Auxiliary Feature Space Translation for Efficient Few-Shot Personalized Object Detection
F. Barbato
Umberto Michieli
J. Moon
Pietro Zanuttigh
Mete Ozay
01 Jul 2024
Knowledge Distillation in Vision Transformers: A Critical Review
Gousia Habib
Tausifa Jan Saleem
Brejesh Lall
04 Feb 2023