Analyzing Approximate Value Iteration Algorithms
In this paper, we consider the stochastic iterative counterpart of the value iteration scheme, wherein only noisy and possibly biased approximations of the Bellman operator are available. We call this counterpart the approximate value iteration (AVI) algorithm. The structure of AVI accounts for implementations with biased function approximations of the Bellman operator and with sampling errors. This is pertinent since value iteration is often combined with neural networks, which approximate the Bellman operator, to solve complex problems susceptible to Bellman's curse of dimensionality. Further, instead of taking an expectation to compute the Bellman operator, one generally uses samples. We present verifiable sufficient conditions under which AVI is stable (almost surely bounded) and converges to a fixed point of the approximate Bellman operator. We show that AVI can also be used in more general settings, i.e., for finding fixed points of contractive set-valued maps. To ensure the stability of AVI, we present three different yet related sets of sufficient conditions, all based on the existence of an appropriate Lyapunov function. These Lyapunov-function-based conditions are easily verifiable and new to the literature. Verifiability is further aided by a recipe we provide for constructing the required Lyapunov function. Finally, we show that the stability analysis of AVI readily extends to the general case of set-valued stochastic approximations.
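To make the setting concrete, the following is a minimal sketch of the kind of stochastic iterative scheme the abstract describes: value iteration on a small tabular MDP where the exact Bellman operator is replaced by a noisy, sample-based estimate and diminishing step sizes are used. This is an illustrative toy, not the paper's algorithm; the random MDP, step-size choice, and all names (`n_states`, `gamma`, `sampled_bellman`, etc.) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy random MDP (illustrative, not from the paper)
n_states, n_actions = 5, 3
gamma = 0.8  # discount factor: the Bellman operator is a gamma-contraction
P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a] is a row of transition probs
R = rng.uniform(0.0, 1.0, size=(n_states, n_actions))             # reward table

def sampled_bellman(V, n_samples=20):
    """Noisy estimate of the Bellman optimality operator (T V)(s):
    expectations over next states are replaced by empirical averages."""
    TV = np.empty(n_states)
    for s in range(n_states):
        q = np.empty(n_actions)
        for a in range(n_actions):
            next_states = rng.choice(n_states, size=n_samples, p=P[s, a])
            q[a] = R[s, a] + gamma * V[next_states].mean()
        TV[s] = q.max()
    return TV

# Stochastic iterative counterpart of value iteration:
#   V_{k+1} = V_k + alpha_k * (hat{T} V_k - V_k), with diminishing steps
V = np.zeros(n_states)
for k in range(2000):
    alpha = 1.0 / (k + 1)
    V = V + alpha * (sampled_bellman(V) - V)

# Sanity check against exact value iteration on the same MDP
V_star = np.zeros(n_states)
for _ in range(500):
    V_star = (R + gamma * P @ V_star).max(axis=1)

print(np.max(np.abs(V - V_star)))  # AVI iterate stays bounded and lands near V*
```

Note that because `sampled_bellman` takes a max over noisy action values, the iterate converges to a fixed point of the *approximate* (slightly biased) operator rather than of the exact one, which is precisely the distinction the abstract draws.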