Cross-Task Knowledge Distillation in Multi-Task Recommendation
arXiv:2202.09852 · 20 February 2022
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen

Papers citing "Cross-Task Knowledge Distillation in Multi-Task Recommendation"

6 papers shown

Knowledge Migration Framework for Smart Contract Vulnerability Detection
Luqi Wang, Wenbao Jiang · 15 Dec 2024

Learning to Maximize Mutual Information for Chain-of-Thought Distillation
Xin Chen, Hanxian Huang, Yanjun Gao, Yi Wang, Jishen Zhao, Ke Ding · 05 Mar 2024

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023

Unbiased Knowledge Distillation for Recommendation
Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He · 27 Nov 2022

Teacher-Student Architecture for Knowledge Learning: A Survey
Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu · 28 Oct 2022

Large scale distributed neural network training through online distillation
Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton · 09 Apr 2018 (FedML)