Direct Distillation between Different Domains
arXiv:2401.06826 · 12 January 2024
Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama
Papers citing "Direct Distillation between Different Domains" (8 papers)
RFMI: Estimating Mutual Information on Rectified Flow for Text-to-Image Alignment
Chao Wang, Giulio Franzese, A. Finamore, Pietro Michiardi · 18 Mar 2025
Decompose, Adjust, Compose: Effective Normalization by Playing with Frequency for Domain Generalization
Sangrok Lee, Jongseong Bae, Ha Young Kim · OOD · 04 Mar 2023
HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression
Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu · VLM · 16 Oct 2021
Model Adaptation: Historical Contrastive Learning for Unsupervised Domain Adaptation without Source Data
Jiaxing Huang, Dayan Guan, Aoran Xiao, Shijian Lu · 07 Oct 2021
Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation
Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu · 19 Apr 2021
Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han · 12 Feb 2021
Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer
Jian Liang, Dapeng Hu, Yunbo Wang, R. He, Jiashi Feng · 14 Dec 2020
Deep Domain-Adversarial Image Generation for Domain Generalisation
Kaiyang Zhou, Yongxin Yang, Timothy M. Hospedales, Tao Xiang · OOD · 12 Mar 2020