
A One-bit, Comparison-Based Gradient Estimator

Abstract

We study zeroth-order optimization for convex functions under the further assumption that function evaluations are unavailable. Instead, one only has access to a comparison oracle, which, given two points x and y, returns a single bit of information indicating which of f(x) and f(y) is larger. By treating the gradient as an unknown signal to be recovered, we show how one can use tools from one-bit compressed sensing to construct a robust and reliable estimator of the normalized gradient. We then propose an algorithm, coined SCOBO, that uses this estimator within a gradient descent scheme. We show that when f(x) has some low-dimensional structure that can be exploited, SCOBO outperforms the state-of-the-art in terms of query complexity. Our theoretical claims are verified by extensive numerical experiments.
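
To make the oracle model concrete, the following is a minimal Python sketch of comparison-based, normalized-gradient descent. It is illustrative only: the function names (comparison_oracle, one_bit_gradient_estimate), the sampling radius r, the number of queries m, and the plain averaging of one-bit answers are assumptions for exposition, not the paper's SCOBO procedure, which exploits gradient sparsity through a one-bit compressed sensing recovery step.

```python
import numpy as np

def comparison_oracle(f, x, y):
    """One bit per query: +1 if f(y) >= f(x), else -1.
    (Sign convention chosen here for illustration.)"""
    return 1.0 if f(y) >= f(x) else -1.0

def one_bit_gradient_estimate(f, x, m=100, r=1e-3, rng=None):
    """Illustrative estimator of the *direction* of the gradient.

    For random directions z_i, the oracle bit approximates
    sign(<z_i, grad f(x)>), so averaging bit_i * z_i points roughly
    along grad f(x). SCOBO instead recovers a sparse gradient via
    one-bit compressed sensing; this average is a simple stand-in.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    g = np.zeros(d)
    for _ in range(m):
        z = rng.standard_normal(d)
        z /= np.linalg.norm(z)
        bit = comparison_oracle(f, x, x + r * z)  # one bit of information
        g += bit * z
    norm = np.linalg.norm(g)
    return g / norm if norm > 0 else g            # only the direction is recoverable

def comparison_based_descent(f, x0, step=0.1, iters=50, m=100, r=1e-3, seed=0):
    """Normalized-gradient descent driven only by comparison queries."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g_hat = one_bit_gradient_estimate(f, x, m=m, r=r, rng=rng)
        x -= step * g_hat                         # fixed step size, for illustration
    return x

if __name__ == "__main__":
    f = lambda x: np.sum(x ** 2)                  # simple convex test function
    x_final = comparison_based_descent(f, x0=np.ones(20))
    print(f(x_final))
```

Since each query yields only one bit, the magnitude of the gradient cannot be recovered, which is why the estimator returns a normalized direction and the descent scheme uses it with a separately chosen step size.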
