arXiv: 2403.14494
Learning to Project for Cross-Task Knowledge Distillation
21 March 2024
Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk
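The title describes learning a projection for cross-task knowledge distillation. As a rough illustration only (an assumption, not the paper's actual method), a learnable linear projector can map student features into the teacher's feature space and be trained to reduce a feature-matching loss. The sketch below uses NumPy with a hand-derived MSE gradient; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
batch, student_dim, teacher_dim = 8, 64, 128

# Hypothetical features: f_s from a student network, f_t from a teacher
# (possibly trained on a different task, hence the learned projection).
f_s = rng.standard_normal((batch, student_dim))
f_t = rng.standard_normal((batch, teacher_dim))

# Learnable linear projector mapping student space -> teacher space.
W = 0.01 * rng.standard_normal((student_dim, teacher_dim))

def kd_loss(W):
    """Mean-squared error between projected student and teacher features."""
    diff = f_s @ W - f_t
    return np.mean(diff ** 2)

# One gradient step on the projector (closed-form gradient of the MSE).
grad = (2.0 / (batch * teacher_dim)) * f_s.T @ (f_s @ W - f_t)
W2 = W - 0.1 * grad

assert kd_loss(W2) < kd_loss(W)  # the projector improves the feature match
```

In practice the projector would be optimized jointly with the student by an autograd framework; the point here is only that the projection, not the student or teacher, absorbs the cross-task mismatch.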
Papers citing "Learning to Project for Cross-Task Knowledge Distillation" (9 papers)
V_kD: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng
10 Mar 2024

Linearly Mapping from Image to Text Space
Jack Merullo, Louis Castricato, Carsten Eickhoff, Ellie Pavlick
30 Sep 2022

Cross-Task Knowledge Distillation in Multi-Task Recommendation
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
20 Feb 2022

Federated Active Learning (F-AL): an Efficient Annotation Strategy for Federated Learning
J. Ahn, Yeeun Ma, Seoyun Park, Cheolwoo You
01 Feb 2022

Efficiently Identifying Task Groupings for Multi-Task Learning
Christopher Fifty, Ehsan Amid, Zhe Zhao, Tianhe Yu, Rohan Anil, Chelsea Finn
10 Sep 2021

Distilling Audio-Visual Knowledge by Compositional Contrastive Learning
Yanbei Chen, Yongqin Xian, A. Sophia Koepke, Ying Shan, Zeynep Akata
22 Apr 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Image-to-Image Translation with Conditional Adversarial Networks
Phillip Isola, Jun-Yan Zhu, Tinghui Zhou, Alexei A. Efros
21 Nov 2016

U-Net: Convolutional Networks for Biomedical Image Segmentation
Olaf Ronneberger, Philipp Fischer, Thomas Brox
18 May 2015