Global rates of convergence in log-concave density estimation

The estimation of a log-concave density on $\mathbb{R}^d$ represents a central problem in the area of nonparametric inference under shape constraints. In this paper, we study the performance of log-concave density estimators with respect to global (e.g. squared Hellinger) loss functions, and adopt a minimax approach. We first show that no statistical procedure based on a sample of size $n$ can estimate a log-concave density with supremum risk smaller than order $n^{-4/5}$ when $d=1$, and order $n^{-2/(d+1)}$ when $d \geq 2$. In particular, this reveals a sense in which, when $d \geq 3$, log-concave density estimation is fundamentally more challenging than the estimation of a density with two bounded derivatives (a problem to which it has been compared). Second, we show that the Hellinger $\varepsilon$-bracketing entropy of a class of log-concave densities with small mean and covariance matrix close to the identity grows like $\max\{\varepsilon^{-d/2}, \varepsilon^{-(d-1)}\}$ as $\varepsilon \searrow 0$. This enables us to obtain rates of convergence for the supremum squared Hellinger risk of the log-concave maximum likelihood estimator: $n^{-4/5}$ when $d=1$ (the minimax optimal rate), $n^{-2/3}\log n$ when $d=2$, and $n^{-1/2}\log n$ when $d=3$.
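The rate regimes above can be collected in a single display. This is only a summary sketch of the stated results, with notation assumed here rather than taken from the abstract: $\hat{f}_n$ denotes the log-concave maximum likelihood estimator based on $n$ observations from a log-concave density $f_0$, and $h$ denotes the Hellinger distance.

```latex
% Summary sketch of the rates stated in the abstract (notation assumed:
% \hat{f}_n is the log-concave MLE, f_0 the true density, h the Hellinger
% distance). Supremum squared Hellinger risk of the MLE:
\[
  \sup_{f_0} \mathbb{E}\, h^2\bigl(\hat{f}_n, f_0\bigr) \;=\;
  \begin{cases}
    O\bigl(n^{-4/5}\bigr) & d = 1 \quad \text{(minimax optimal)},\\[2pt]
    O\bigl(n^{-2/3}\log n\bigr) & d = 2,\\[2pt]
    O\bigl(n^{-1/2}\log n\bigr) & d = 3,
  \end{cases}
\]
% while the minimax lower bound says no procedure can achieve supremum
% risk smaller than order n^{-4/5} (d = 1) or n^{-2/(d+1)} (d >= 2).
\]
```

Note that for $d=2,3$ the lower bound $n^{-2/(d+1)}$ gives $n^{-2/3}$ and $n^{-1/2}$ respectively, so the maximum likelihood estimator matches the minimax rate up to logarithmic factors in those dimensions.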