Adaptation in multivariate log-concave density estimation

We study the adaptation properties of the multivariate log-concave maximum likelihood estimator over two subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such a density can be measured by $\Gamma$, the sum of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by the logarithm of the density. Given $n$ independent observations from a $d$-dimensional log-concave density of this type with $d \in \{2,3\}$, we prove a sharp oracle inequality, which in particular implies that the Kullback--Leibler risk of the log-concave maximum likelihood estimator for such densities is bounded above by $\Gamma/n$, up to a polylogarithmic factor. Thus, the rate can be essentially parametric, even in this multivariate setting. The second subclass consists of densities whose contours are well separated; these new classes are constructed to be affine invariant and turn out to contain a wide variety of densities, including those satisfying H\"older regularity conditions. Here, we prove another sharp oracle inequality, which reveals in particular that, over the class of $\beta$-H\"older log-concave densities with $\beta$ in a suitable range, the log-concave maximum likelihood estimator attains a smoothness-dependent Kullback--Leibler risk bound that improves on the worst-case rate, again up to a polylogarithmic factor.
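The general shape of a sharp oracle inequality of the kind described above can be sketched as follows; this is an illustrative schematic under assumed notation (the constant $C$, the polylogarithmic factor, and the symbol $\Gamma(f)$ for the facet complexity are not taken from the paper's exact statement):

```latex
% Schematic sharp oracle inequality for the log-concave MLE \hat{f}_n
% based on n observations from f_0 (illustrative sketch only):
\[
  \mathbb{E}\, d_{\mathrm{KL}}\bigl(f_0, \hat{f}_n\bigr)
  \;\le\;
  \inf_{f \in \mathcal{F}}
  \Bigl\{ d_{\mathrm{KL}}(f_0, f)
    + \frac{C\,\Gamma(f)\,\mathrm{polylog}(n)}{n} \Bigr\},
\]
% where \mathcal{F} is the class of log-concave densities with polyhedral
% support and piecewise-affine logarithm, and \Gamma(f) is the sum of the
% numbers of facets of the subdomains in the induced polyhedral subdivision.
```

For instance, if $f_0$ is uniform on a polytope with $F$ facets, then $\log f_0$ is affine on its support, so taking $f = f_0$ in the infimum yields a risk bound of order $F/n$ up to polylogarithmic factors, consistent with the essentially parametric rate described above.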