Dual Correction Strategy for Ranking Distillation in Top-N Recommender System
8 September 2021 · Youngjune Lee, Kee-Eung Kim · arXiv:2109.03459

Papers citing "Dual Correction Strategy for Ranking Distillation in Top-N Recommender System"

5 papers shown

Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective
Zhangchi Zhu, Wei Zhang · 16 Nov 2024

Dynamic Sparse Learning: A Novel Paradigm for Efficient Recommendation
Shuyao Wang, Yongduo Sui, Jiancan Wu, Zhi Zheng, Hui Xiong · 05 Feb 2024

Distillation from Heterogeneous Models for Top-K Recommendation
SeongKu Kang, Wonbin Kweon, Dongha Lee, Jianxun Lian, Xing Xie, Hwanjo Yu · VLM · 02 Mar 2023

Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation
Z. Zhu, Shijing Si, Jianzong Wang, Yaodong Yang, Jing Xiao · FedML · 24 Aug 2022

Consensus Learning from Heterogeneous Objectives for One-Class Collaborative Filtering
SeongKu Kang, Dongha Lee, Wonbin Kweon, Junyoung Hwang, Hwanjo Yu · 26 Feb 2022