Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains

18 January 2021 · arXiv:2101.07308
Le Thanh Nguyen-Meidine, Atif Belal, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin, Eric Granger

Papers citing "Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains" (2 papers)

Pre-Training Transformers for Domain Adaptation
Burhan Ul Tayyab, Nicholas Chua
ViT · 18 Dec 2021

Unsupervised Domain Adaptation in the Dissimilarity Space for Person Re-identification
Djebril Mekhazni, Amran Bhuiyan, G. Ekladious, Eric Granger
OOD · 27 Jul 2020