
Adaptation in multivariate log-concave density estimation

OAI: oai:purehost.bath.ac.uk:openaire_cris_publications/762213a9-9db6-47f2-89dd-2c9881a1ed3b
DOI: https://doi.org/10.1214/20-AOS1950

Abstract

We study the adaptation properties of the multivariate log-concave maximum likelihood estimator over three subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such a density f can be measured in terms of the sum Γ(f) of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by f. Given n independent observations from a d-dimensional log-concave density with d ∈ {2, 3}, we prove a sharp oracle inequality, which in particular implies that the Kullback–Leibler risk of the log-concave maximum likelihood estimator for such densities is bounded above by Γ(f)/n, up to a polylogarithmic factor. Thus, the rate can be essentially parametric, even in this multivariate setting. For the second type of adaptation, we consider densities that are bounded away from zero on a polytopal support; we show that, up to polylogarithmic factors, the log-concave maximum likelihood estimator attains the rate n^(−4/7) when d = 3, which is faster than the worst-case rate of n^(−1/2). Finally, our third subclass consists of densities whose contours are well separated; these new classes are constructed to be affine invariant and turn out to contain a wide variety of densities, including those that satisfy Hölder regularity conditions. Here, we prove another sharp oracle inequality, which reveals in particular that the log-concave maximum likelihood estimator attains a risk bound of order n^(−min{(β+3)/(β+7), 4/7}) when d = 3 over the class of β-Hölder log-concave densities with β > 1, again up to a polylogarithmic factor.
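To see how the Hölder-class rate behaves, the exponent min{(β+3)/(β+7), 4/7} for d = 3 can be evaluated numerically. The sketch below is purely illustrative (the function name is ours, not from the paper); it shows that the exponent increases with the smoothness β until β = 7/3, after which it is capped at 4/7, matching the rate for densities bounded away from zero on a polytopal support.

```python
def risk_exponent(beta: float) -> float:
    """Exponent r in the risk bound n^(-r) for beta-Hölder
    log-concave densities when d = 3 (up to polylog factors)."""
    if beta <= 1:
        raise ValueError("the bound is stated for beta > 1")
    return min((beta + 3) / (beta + 7), 4 / 7)

for beta in (1.5, 2.0, 7 / 3, 3.0, 5.0):
    print(f"beta = {beta:.3f} -> exponent {risk_exponent(beta):.4f}")
```

For example, β = 2 gives exponent 5/9, while any β ≥ 7/3 gives the capped exponent 4/7, so additional smoothness beyond β = 7/3 yields no further improvement in this bound.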