Eluder Dimension and Generalized Rank

Neural Information Processing Systems (NeurIPS), 2021
Abstract

We study the relationship between the eluder dimension for a function class and a generalized notion of rank, defined for any monotone "activation" $\sigma : \mathbb{R} \to \mathbb{R}$, which corresponds to the minimal dimension required to represent the class as a generalized linear model. When $\sigma$ has derivatives bounded away from $0$, it is known that $\sigma$-rank gives rise to an upper bound on eluder dimension for any function class; we show, however, that eluder dimension can be exponentially smaller than $\sigma$-rank. We also show that the condition on the derivative is necessary; namely, when $\sigma$ is the $\mathrm{relu}$ activation, we show that eluder dimension can be exponentially larger than $\sigma$-rank.
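As a sketch of the notion of $\sigma$-rank described in the abstract (the embedding $\phi$, the weights $w_f$, and the domain $\mathcal{X}$ are illustrative notation introduced here, not taken from the paper):

```latex
% Plausible formalization of \sigma-rank as the minimal dimension d
% in which every f in the class \mathcal{F} is a generalized linear
% model with activation \sigma.
\[
  \mathrm{rank}_{\sigma}(\mathcal{F})
  \;=\; \min \Bigl\{ d \in \mathbb{N} \;:\;
      \exists\, \phi : \mathcal{X} \to \mathbb{R}^{d},\;
      \exists\, (w_f)_{f \in \mathcal{F}} \subseteq \mathbb{R}^{d}
      \ \text{such that}\ 
      f(x) = \sigma\bigl(\langle w_f, \phi(x)\rangle\bigr)
      \ \ \forall\, f \in \mathcal{F},\ x \in \mathcal{X} \Bigr\}.
\]
```

Under this reading, the abstract's results compare $\mathrm{rank}_{\sigma}(\mathcal{F})$ to the eluder dimension of $\mathcal{F}$ in both directions, depending on whether $\sigma'$ is bounded away from $0$.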
