A Confirmation of a Conjecture on Feldman's Two-Armed Bandit Problem
Zengjing Chen
Yiwei Lin
Jichen Zhang
Abstract
The myopic strategy is one of the most important strategies in the study of bandit problems. In this paper, we consider the two-armed bandit problem proposed by Feldman. For general distributions and utility functions, we obtain a necessary and sufficient condition for the optimality of the myopic strategy. As an application, we resolve Nouiehed and Ross's conjecture for Bernoulli two-armed bandit problems, namely that the myopic strategy stochastically maximizes the number of wins.
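To make the setting concrete, the following is a minimal sketch of the myopic strategy in Feldman's Bernoulli two-armed bandit: two arms have known success probabilities `a` and `b` with `a > b`, but it is unknown which arm carries which probability, and `xi` denotes the posterior probability that arm 1 is the `a`-arm. The myopic rule plays, at each step, the arm with the larger expected immediate reward, which here reduces to playing arm 1 iff `xi >= 1/2`. All names and parameters below are illustrative assumptions, not notation from the paper.

```python
import random


def myopic_feldman(a, b, xi0, horizon, rng):
    """Simulate the myopic strategy in Feldman's two-armed bandit.

    Illustrative sketch: `a`, `b` are the two known success probabilities
    (a > b); `xi0` is the prior probability that arm 1 is the a-arm.
    Returns the total number of wins over `horizon` plays.
    """
    # Nature's hidden assignment: arm 1 is the a-arm with probability xi0.
    arm1_is_a = rng.random() < xi0
    p = (a, b) if arm1_is_a else (b, a)

    xi = xi0  # posterior probability that arm 1 is the a-arm
    wins = 0
    for _ in range(horizon):
        # Myopic rule: expected reward of arm 1 is xi*a + (1-xi)*b, of
        # arm 2 is xi*b + (1-xi)*a; arm 1 is preferred iff xi >= 1/2.
        arm = 0 if xi >= 0.5 else 1
        win = rng.random() < p[arm]
        wins += win

        # Bayes update of xi given the outcome observed on `arm`.
        if arm == 0:
            la = a if win else 1 - a  # likelihood if arm 1 is the a-arm
            lb = b if win else 1 - b  # likelihood if arm 1 is the b-arm
        else:
            la = b if win else 1 - b
            lb = a if win else 1 - a
        xi = xi * la / (xi * la + (1 - xi) * lb)
    return wins
```

The conjecture confirmed in the paper says that, under this rule, the distribution of `wins` stochastically dominates that of any other strategy; the simulation above only illustrates the dynamics, not the optimality proof.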
