Fast Distributed Bandits for Online Recommendation Systems
K. Mahadik, Qingyun Wu, Shuai Li, Amit Sabne
arXiv:2007.08061 · 16 July 2020
Papers citing "Fast Distributed Bandits for Online Recommendation Systems" (7 papers)
Towards Domain Adaptive Neural Contextual Bandits
Ziyan Wang, Hao Wang
13 Jun 2024

ProtoBandit: Efficient Prototype Selection via Multi-Armed Bandits
A. Chaudhuri, Pratik Jawanpuria, Bamdev Mishra
04 Oct 2022

Context Uncertainty in Contextual Bandits with Applications to Recommender Systems
Hao Wang, Yifei Ma, Hao Ding, Yuyang Wang
01 Feb 2022

Asynchronous Upper Confidence Bound Algorithms for Federated Linear Bandits
Chuanhao Li, Hongning Wang
FedML · 04 Oct 2021

Federated Multi-Armed Bandits
Chengshuai Shi, Cong Shen
FedML · 28 Jan 2021

Efficient Contextual Bandits with Continuous Actions
Maryam Majzoubi, Chicheng Zhang, Rajan Chari, A. Krishnamurthy, John Langford, Aleksandrs Slivkins
OffRL · 10 Jun 2020

Continuous Action Reinforcement Learning from a Mixture of Interpretable Experts
R. Akrour, Davide Tateo, Jan Peters
10 Jun 2020