ResearchTrend.AI
Ranking Distillation: Learning Compact Ranking Models With High Performance for Recommender System
Jiaxi Tang, Ke Wang
19 September 2018

Papers citing "Ranking Distillation: Learning Compact Ranking Models With High Performance for Recommender System"

27 / 27 papers shown
  1. Theoretical Guarantees for LT-TTD: A Unified Transformer-based Architecture for Two-Level Ranking Systems (Ayoub Abraich; 07 May 2025)
  2. MSCRS: Multi-modal Semantic Graph Prompt Learning Framework for Conversational Recommender Systems (Yibiao Wei, Jie Zou, Weikang Guo, Guoqing Wang, Xing Xu, Yang Yang; 15 Apr 2025)
  3. External Large Foundation Model: How to Efficiently Serve Trillions of Parameters for Online Ads Recommendation (Mingfu Liang, Xi Liu, Rong Jin, B. Liu, Qiuling Suo, ..., Bo Long, Wenlin Chen, Rocky Liu, Santanu Kolay, H. Li; 20 Feb 2025)
  4. A Hybrid Cross-Stage Coordination Pre-ranking Model for Online Recommendation Systems (Binglei Zhao, Houying Qi, Guang Xu, Mian Ma, Xiwei Zhao, Feng Mei, Sulong Xu, Jinghe Hu; 17 Feb 2025)
  5. Invariant debiasing learning for recommendation via biased imputation (Ting Bai, Weijie Chen, Cheng Yang, C. Shi; 28 Dec 2024)
  6. Exploring Feature-based Knowledge Distillation for Recommender System: A Frequency Perspective (Zhangchi Zhu, Wei Zhang; 16 Nov 2024)
  7. Augmenting Offline RL with Unlabeled Data (Zhao Wang, Briti Gangopadhyay, Jia-Fong Yeh, Shingo Takamatsu; OffRL; 11 Jun 2024)
  8. Integrating Domain Knowledge for handling Limited Data in Offline RL (Briti Gangopadhyay, Zhao Wang, Jia-Fong Yeh, Shingo Takamatsu; OffRL; 11 Jun 2024)
  9. Teacher-Student Architecture for Knowledge Distillation: A Survey (Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu; 08 Aug 2023)
  10. Deep Ranking Ensembles for Hyperparameter Optimization (Abdus Salam Khazi, Sebastian Pineda Arango, Josif Grabocka; BDL; 27 Mar 2023)
  11. Distillation from Heterogeneous Models for Top-K Recommendation (SeongKu Kang, Wonbin Kweon, Dongha Lee, Jianxun Lian, Xing Xie, Hwanjo Yu; VLM; 02 Mar 2023)
  12. Unbiased Knowledge Distillation for Recommendation (Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He; 27 Nov 2022)
  13. Teacher-Student Architecture for Knowledge Learning: A Survey (Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu; 28 Oct 2022)
  14. Cooperative Retriever and Ranker in Deep Recommenders (Xunpeng Huang, Defu Lian, Jin Chen, Liu Zheng, Xing Xie, Enhong Chen; VLM, AI4TS; 28 Jun 2022)
  15. Consensus Learning from Heterogeneous Objectives for One-Class Collaborative Filtering (SeongKu Kang, Dongha Lee, Wonbin Kweon, Junyoung Hwang, Hwanjo Yu; 26 Feb 2022)
  16. Secure Your Ride: Real-time Matching Success Rate Prediction for Passenger-Driver Pairs (Yuandong Wang, Hongzhi Yin, Lian Wu, Tong Chen, Chunyang Liu; 14 Sep 2021)
  17. Dual Correction Strategy for Ranking Distillation in Top-N Recommender System (Youngjune Lee, Kee-Eung Kim; 08 Sep 2021)
  18. Privileged Graph Distillation for Cold Start Recommendation (Shuai Wang, Kun Zhang, Le Wu, Haiping Ma, Richang Hong, Meng Wang; 31 May 2021)
  19. DE-RRD: A Knowledge Distillation Framework for Recommender System (SeongKu Kang, Junyoung Hwang, Wonbin Kweon, Hwanjo Yu; 08 Dec 2020)
  20. Knowledge Distillation: A Survey (Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao; VLM; 09 Jun 2020)
  21. AIBench Scenario: Scenario-distilling AI Benchmarking (Wanling Gao, Fei Tang, Jianfeng Zhan, Xu Wen, Lei Wang, Zheng Cao, Chuanxin Lan, Chunjie Luo, Xiaoli Liu, Zihan Jiang; 06 May 2020)
  22. AIBench Training: Balanced Industry-Standard AI Training Benchmarking (Fei Tang, Wanling Gao, Jianfeng Zhan, Chuanxin Lan, Xu Wen, ..., Yatao Li, Junchao Shao, Zhenyu Wang, Xiaoyu Wang, Hainan Ye; 30 Apr 2020)
  23. Understanding and Improving Knowledge Distillation (Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain; 10 Feb 2020)
  24. Collaborative Distillation for Top-N Recommendation (Jae-woong Lee, Minjin Choi, Jongwuk Lee, Hyunjung Shim; 13 Nov 2019)
  25. Large scale distributed neural network training through online distillation (Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton; FedML; 09 Apr 2018)
  26. The Loss Surfaces of Multilayer Networks (A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun; ODL; 30 Nov 2014)
  27. Convolutional Neural Networks for Sentence Classification (Yoon Kim; AILaw, VLM; 25 Aug 2014)