Stochastic Zeroth Order Gradient and Hessian Estimators: Variance Reduction and Refined Bias Bounds

Abstract

We study stochastic zeroth order gradient and Hessian estimators for real-valued functions in $\mathbb{R}^n$. We show that, by taking finite differences along random orthogonal directions, the variance of stochastic finite-difference estimators can be significantly reduced. In particular, we design estimators for smooth functions such that, if one uses $\Theta(k)$ random directions sampled from the Stiefel manifold $\text{St}(n,k)$ and finite-difference granularity $\delta$, the variance of the gradient estimator is bounded by $\mathcal{O}\left( \left( \frac{n}{k} - 1 \right) + \left( \frac{n^2}{k} - n \right) \delta^2 + \frac{n^2 \delta^4}{k} \right)$, and the variance of the Hessian estimator is bounded by $\mathcal{O}\left( \left( \frac{n^2}{k^2} - 1 \right) + \left( \frac{n^4}{k^2} - n^2 \right) \delta^2 + \frac{n^4 \delta^4}{k^2} \right)$. When $k = n$, the variances become negligibly small. In addition, we provide improved bias bounds for the estimators. The bias of both the gradient and Hessian estimators for a smooth function $f$ is of order $\mathcal{O}(\delta^2 \Gamma)$, where $\delta$ is the finite-difference granularity and $\Gamma$ depends on high-order derivatives of $f$. Our results are corroborated by empirical observations.
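To make the construction concrete, the following is a minimal sketch (not the paper's exact estimator, whose scaling may differ) of a zeroth order gradient estimator that takes central finite differences along $k$ random orthogonal directions, sampled from the Stiefel manifold $\text{St}(n,k)$ via QR factorization of a Gaussian matrix. The function name and signature are illustrative assumptions.

```python
import numpy as np

def zeroth_order_gradient(f, x, k, delta, rng=None):
    """Illustrative sketch: finite-difference gradient estimator along k
    random orthogonal directions (columns of a random element of the
    Stiefel manifold St(n, k), obtained via QR of a Gaussian matrix).
    Not the paper's exact construction; scaling conventions may differ."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    # Q has orthonormal columns: Q.T @ Q = I_k.
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    g = np.zeros(n)
    for i in range(k):
        v = Q[:, i]
        # Central finite difference along direction v with granularity delta.
        g += (f(x + delta * v) - f(x - delta * v)) / (2 * delta) * v
    # Scale by n/k so the estimator targets the full gradient in expectation.
    return (n / k) * g
```

When $k = n$ the columns of $Q$ form an orthonormal basis, so for a quadratic $f$ the central differences recover the gradient exactly, consistent with the variance vanishing at $k = n$.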
