Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing
arXiv:2401.01102

2 January 2024
Zhe Kong
Wentian Zhang
Tao Wang
Kaihao Zhang
Yuexiang Li
Xiaoying Tang
Tong Lu
Communities: AAML, CV, BM

Papers citing "Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing"

1 / 1 papers shown

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
International Conference on Learning Representations (ICLR), 2019
23 Oct 2019