On the Combinatorial Multi-Armed Bandit Problem with Markovian Rewards
Yi Gai, Bhaskar Krishnamachari, M. Liu
arXiv:1012.3005, 14 December 2010
Papers citing "On the Combinatorial Multi-Armed Bandit Problem with Markovian Rewards" (2 of 2 papers shown)

- The Non-Bayesian Restless Multi-Armed Bandit: a Case of Near-Logarithmic Regret
  Wenhan Dai, Yi Gai, Bhaskar Krishnamachari, Qing Zhao
  22 Nov 2010

- Online Learning in Opportunistic Spectrum Access: A Restless Bandit Approach
  Cem Tekin, M. Liu
  01 Oct 2010