ResearchTrend.AI

Ensemble Knowledge Distillation for CTR Prediction
arXiv: 2011.04106 (v2, latest)
8 November 2020
Jieming Zhu
Jinyang Liu
Weiqi Li
Jincai Lai
Xiuqiang He
Liang Chen
Zibin Zheng

Papers citing "Ensemble Knowledge Distillation for CTR Prediction"

16 papers
Learning Task-Agnostic Representations through Multi-Teacher Distillation
Philippe Formont, Maxime Darrin, Banafsheh Karimian, Jackie Chi Kit Cheung, Eric Granger, Ismail Ben Ayed, Mohammadhadi Shateri, Pablo Piantanida
21 Oct 2025
External Large Foundation Model: How to Efficiently Serve Trillions of Parameters for Online Ads Recommendation
The Web Conference (WWW), 2025
Mingfu Liang, Xi Liu, Rong Jin, B. Liu, Qiuling Suo, ..., Bo Long, Wenlin Chen, Rocky Liu, Santanu Kolay, Haoyang Li
20 Feb 2025
A Unified Knowledge-Distillation and Semi-Supervised Learning Framework to Improve Industrial Ads Delivery Systems
Hamid Eghbalzadeh, Yang Wang, Rui Li, Yuji Mo, Qin Ding, ..., Shuo Gu, Nima Noorshams, Sem Park, Bo Long, Xue Feng
05 Feb 2025
Retrieval and Distill: A Temporal Data Shift-Free Paradigm for Online Recommendation System
Lei Zheng, Ning Li, Weinan Zhang, Yong Yu
24 Apr 2024
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023
Pairwise Ranking Losses of Click-Through Rates Prediction for Welfare Maximization in Ad Auctions
International Conference on Machine Learning (ICML), 2023
Boxiang Lyu, Zhe Feng, Zachary Robertson, Sanmi Koyejo
01 Jun 2023
Distillation from Heterogeneous Models for Top-K Recommendation
The Web Conference (WWW), 2023
SeongKu Kang, Wonbin Kweon, Dongha Lee, Jianxun Lian, Xing Xie, Hwanjo Yu
02 Mar 2023
Unbiased Knowledge Distillation for Recommendation
Web Search and Data Mining (WSDM), 2022
Gang Chen, Jiawei Chen, Fuli Feng, Sheng Zhou, Xiangnan He
27 Nov 2022
Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation
Web Search and Data Mining (WSDM), 2022
Zhen Tian, Ting Bai, Ziyan Zhang, Zhiyuan Xu, Kangyi Lin, Ji-Rong Wen, Wayne Xin Zhao
21 Nov 2022
FeatureBox: Feature Engineering on GPUs for Massive-Scale Ads Systems
Weijie Zhao, Xuewu Jiao, Xinsheng Luo, Jingxue Li, Belhal Karimi, Ping Li
26 Sep 2022
A Comprehensive Survey on Trustworthy Recommender Systems
Wenqi Fan, Xiangyu Zhao, Xiao Chen, Jingran Su, Jingtong Gao, ..., Qidong Liu, Yiqi Wang, Hanfeng Xu, Lei Chen, Qing Li
21 Sep 2022
Click-Through Rate Prediction in Online Advertising: A Literature Review
Information Processing & Management (IPM), 2022
Yanwu Yang, Panyu Zhai
22 Feb 2022
Cross-Task Knowledge Distillation in Multi-Task Recommendation
AAAI Conference on Artificial Intelligence (AAAI), 2022
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
20 Feb 2022
Learning-To-Ensemble by Contextual Rank Aggregation in E-Commerce
Web Search and Data Mining (WSDM), 2021
Xuesi Wang, Guangda Huzhang, Qianying Lin, Qing Da
19 Jul 2021
Topology Distillation for Recommender System
SeongKu Kang, Junyoung Hwang, Wonbin Kweon, Hwanjo Yu
16 Jun 2021
BARS-CTR: Open Benchmarking for Click-Through Rate Prediction
International Conference on Information and Knowledge Management (CIKM), 2020
Jieming Zhu, Jinyang Liu, Shuai Yang, Tao Gui, Xiuqiang He
12 Sep 2020