arXiv: 2211.14729
Unbiased Knowledge Distillation for Recommendation
27 November 2022
Gang Chen
Jiawei Chen
Fuli Feng
Sheng Zhou
Xiangnan He
Papers citing "Unbiased Knowledge Distillation for Recommendation" (5 shown):

1. External Large Foundation Model: How to Efficiently Serve Trillions of Parameters for Online Ads Recommendation
   Mingfu Liang, Xi Liu, Rong Jin, B. Liu, Qiuling Suo, ..., Bo Long, Wenlin Chen, Rocky Liu, Santanu Kolay, H. Li
   20 Feb 2025

2. Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective
   Zhangchi Zhu, Wei Zhang
   16 Nov 2024

3. Teacher-Student Architecture for Knowledge Distillation: A Survey
   Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
   08 Aug 2023

4. Cross-Task Knowledge Distillation in Multi-Task Recommendation
   Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
   20 Feb 2022

5. Distilling Causal Effect of Data in Class-Incremental Learning
   Xinting Hu, Kaihua Tang, C. Miao, Xiansheng Hua, Hanwang Zhang (CML)
   02 Mar 2021