arXiv:2001.05699 (v2, latest)
Combining Offline Causal Inference and Online Bandit Learning for Data Driven Decision
16 January 2020
Authors: Li Ye, Yishi Lin, Hong Xie, John C. S. Lui
Topics: CML
Links: ArXiv (abs) · PDF · HTML
Papers citing "Combining Offline Causal Inference and Online Bandit Learning for Data Driven Decision" (6 papers):
- Leveraging (Biased) Information: Multi-armed Bandits with Offline Data. Wang Chi Cheung, Lixing Lyu. OffRL. 04 May 2024.
- Optimal Best-Arm Identification in Bandits with Access to Offline Data. Shubhada Agrawal, Sandeep Juneja, Karthikeyan Shanmugam, A. Suggala. 15 Jun 2023.
- Evaluation Methods and Measures for Causal Learning Algorithms. Lu Cheng, Ruocheng Guo, Raha Moraffah, Paras Sheth, K. S. Candan, Huan Liu. CML, ELM. 07 Feb 2022.
- Combining Online Learning and Offline Learning for Contextual Bandits with Deficient Support. Hung The Tran, Sunil R. Gupta, Thanh Nguyen-Tang, Santu Rana, Svetha Venkatesh. OffRL. 24 Jul 2021.
- Bandits with Partially Observable Confounded Data. Guy Tennenholtz, Uri Shalit, Shie Mannor, Yonathan Efroni. OffRL. 11 Jun 2020.
- Online Pricing with Offline Data: Phase Transition and Inverse Square Law. Jinzhi Bu, D. Simchi-Levi, Yunzong Xu. OffRL. 19 Oct 2019.