ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2002.08436
Residual Bootstrap Exploration for Bandit Algorithms
19 February 2020
ChiHua Wang, Yang Yu, Botao Hao, Guang Cheng

Papers citing "Residual Bootstrap Exploration for Bandit Algorithms" (5 of 5 shown)
Multiplier Bootstrap-based Exploration
Runzhe Wan, Haoyu Wei, Branislav Kveton, R. Song
03 Feb 2023
From Optimality to Robustness: Dirichlet Sampling Strategies in Stochastic Bandits
Dorian Baudry, Patrick Saux, Odalric-Ambrym Maillard
18 Nov 2021
Online Bootstrap Inference For Policy Evaluation in Reinforcement Learning
Pratik Ramprasad, Yuantong Li, Zhuoran Yang, Zhaoran Wang, W. Sun, Guang Cheng
08 Aug 2021
Sub-sampling for Efficient Non-Parametric Bandit Exploration
Dorian Baudry, E. Kaufmann, Odalric-Ambrym Maillard
27 Oct 2020
BanditPAM: Almost Linear Time $k$-Medoids Clustering via Multi-Armed Bandits
Mo Tiwari, Martin Jinye Zhang, James Mayclin, Sebastian Thrun, Chris Piech, Ilan Shomorony
11 Jun 2020