Dual Correction Strategy for Ranking Distillation in Top-N Recommender System
arXiv 2109.03459 · 8 September 2021
Youngjune Lee, Kee-Eung Kim
Papers citing "Dual Correction Strategy for Ranking Distillation in Top-N Recommender System" (5 papers)
1. Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective. Zhangchi Zhu, Wei Zhang. 16 Nov 2024.
2. Dynamic Sparse Learning: A Novel Paradigm for Efficient Recommendation. Shuyao Wang, Yongduo Sui, Jiancan Wu, Zhi Zheng, Hui Xiong. 05 Feb 2024.
3. Distillation from Heterogeneous Models for Top-K Recommendation. SeongKu Kang, Wonbin Kweon, Dongha Lee, Jianxun Lian, Xing Xie, Hwanjo Yu. 02 Mar 2023. [VLM]
4. Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation. Z. Zhu, Shijing Si, Jianzong Wang, Yaodong Yang, Jing Xiao. 24 Aug 2022. [FedML]
5. Consensus Learning from Heterogeneous Objectives for One-Class Collaborative Filtering. SeongKu Kang, Dongha Lee, Wonbin Kweon, Junyoung Hwang, Hwanjo Yu. 26 Feb 2022.