Residual Bootstrap Exploration for Bandit Algorithms
arXiv: 2002.08436
19 February 2020
ChiHua Wang, Yang Yu, Botao Hao, Guang Cheng

Papers citing "Residual Bootstrap Exploration for Bandit Algorithms" (5 of 5 shown)

Multiplier Bootstrap-based Exploration
Runzhe Wan, Haoyu Wei, Branislav Kveton, R. Song
03 Feb 2023

From Optimality to Robustness: Dirichlet Sampling Strategies in Stochastic Bandits
Dorian Baudry, Patrick Saux, Odalric-Ambrym Maillard
18 Nov 2021

Online Bootstrap Inference For Policy Evaluation in Reinforcement Learning
Pratik Ramprasad, Yuantong Li, Zhuoran Yang, Zhaoran Wang, W. Sun, Guang Cheng
08 Aug 2021

Sub-sampling for Efficient Non-Parametric Bandit Exploration
Dorian Baudry, E. Kaufmann, Odalric-Ambrym Maillard
27 Oct 2020

BanditPAM: Almost Linear Time k-Medoids Clustering via Multi-Armed Bandits
Mo Tiwari, Martin Jinye Zhang, James Mayclin, Sebastian Thrun, Chris Piech, Ilan Shomorony
11 Jun 2020