arXiv:2202.09852
Cross-Task Knowledge Distillation in Multi-Task Recommendation
20 February 2022
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
Papers citing "Cross-Task Knowledge Distillation in Multi-Task Recommendation" (6 papers):
Knowledge Migration Framework for Smart Contract Vulnerability Detection
Luqi Wang, Wenbao Jiang
15 Dec 2024
Learning to Maximize Mutual Information for Chain-of-Thought Distillation
Xin Chen, Hanxian Huang, Yanjun Gao, Yi Wang, Jishen Zhao, Ke Ding
05 Mar 2024
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023
Unbiased Knowledge Distillation for Recommendation
Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He
27 Nov 2022
Teacher-Student Architecture for Knowledge Learning: A Survey
Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu
28 Oct 2022
Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
09 Apr 2018