Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains
arXiv: 2101.07308
18 January 2021
Le Thanh Nguyen-Meidine
Atif Belal
M. Kiran
Jose Dolz
Louis-Antoine Blais-Morin
Eric Granger
Papers citing "Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains" (2 papers shown)
Pre-Training Transformers for Domain Adaptation (ViT)
Burhan Ul Tayyab, Nicholas Chua
18 Dec 2021
Unsupervised Domain Adaptation in the Dissimilarity Space for Person Re-identification (OOD)
Djebril Mekhazni, Amran Bhuiyan, G. Ekladious, Eric Granger
27 Jul 2020