Finite-Time Analysis of Asynchronous Stochastic Approximation and Q-Learning
Annual Conference Computational Learning Theory (COLT), 2020
Adam Wierman
Abstract
We consider a general asynchronous Stochastic Approximation (SA) scheme featuring a weighted infinity-norm contractive operator, and prove a bound on its finite-time convergence rate on a single trajectory. Additionally, we specialize the result to asynchronous Q-learning. The resulting bound matches the sharpest available bound for synchronous Q-learning, and improves upon previously known bounds for asynchronous Q-learning.
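To make the asynchronous setting concrete, below is a minimal sketch (not the paper's analysis or algorithm) of asynchronous Q-learning on a single trajectory: at each step only the visited (state, action) entry of the Q-table is updated, which is the single-trajectory, asynchronous SA regime the abstract refers to. The environment interface (`env.reset`/`env.step`), the epsilon-greedy behavior policy, and the diminishing step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def async_q_learning(env, num_states, num_actions, gamma=0.9,
                     num_steps=10_000, epsilon=0.1, seed=0):
    """Asynchronous Q-learning sketch: one (s, a) entry updated per step."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((num_states, num_actions))
    visits = np.zeros((num_states, num_actions))  # per-entry update counters
    s = env.reset()
    for _ in range(num_steps):
        # Epsilon-greedy behavior policy along a single trajectory (assumed).
        if rng.random() < epsilon:
            a = int(rng.integers(num_actions))
        else:
            a = int(np.argmax(Q[s]))
        s_next, r, done = env.step(a)  # assumed environment interface
        visits[s, a] += 1
        alpha = 1.0 / visits[s, a]  # illustrative diminishing step size
        # Asynchronous update: only the visited (s, a) entry changes.
        target = r + (0.0 if done else gamma * np.max(Q[s_next]))
        Q[s, a] += alpha * (target - Q[s, a])
        s = env.reset() if done else s_next
    return Q
```

In contrast, a synchronous scheme would update every (state, action) pair at each iteration using fresh samples; the paper's contribution concerns rates for the asynchronous, single-trajectory case illustrated above.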
