## Approximation by log-concave distributions, with applications to regression. (English) Zbl 1216.62023

Summary: We study the approximation of arbitrary distributions $$P$$ on $$d$$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback-Leibler-type functional. We show that such an approximation exists if and only if $$P$$ has finite first moments and is not supported by a hyperplane. Furthermore, we show that this approximation depends continuously on $$P$$ with respect to the Mallows distance $$D_{1}(\cdot , \cdot )$$ of C. L. Mallows [Ann. Math. Stat. 43, 508–515 (1972; Zbl 0238.60017)]. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with response $$Y=\mu (X)+\varepsilon$$, where $$X$$ and $$\varepsilon$$ are independent, $$\mu (\cdot )$$ belongs to a certain class of regression functions, and $$\varepsilon$$ is a random error with log-concave density and mean zero.
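The two objects at the heart of the summary can be sketched as follows; the notation ($$\mathcal{F}_d$$ for the class of log-concave densities, $$\psi^*$$ for the projection) is not from the entry itself and is only one common convention for such functionals:

```latex
% Log-concave approximation of a distribution P on R^d:
% maximize a log-likelihood-type functional over the class F_d
% of log-concave densities (notation assumed, not from the entry).
\psi^*(P) \;=\; \operatorname*{arg\,max}_{f \in \mathcal{F}_d} \int \log f \, dP .
% When P itself has a density p, maximizing this functional is
% equivalent to minimizing the Kullback-Leibler divergence
% D_{KL}(P \| f) = \int p \log (p/f),
% which is the sense in which the approximation is "KL-type".

% Mallows distance D_1 (also known as the L^1-Wasserstein distance):
D_1(P, Q) \;=\; \inf \bigl\{ \mathbb{E}\,\lVert X - Y \rVert :
    X \sim P,\; Y \sim Q \bigr\},
% the infimum running over all couplings (X, Y) of P and Q.
```

Under this reading, the continuity result says that $$P \mapsto \psi^*(P)$$ is continuous when the domain carries the $$D_1$$ metric, which is what drives the consistency statements for the maximum likelihood estimator and the regression estimators.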

### MSC:

- 62E17 Approximations to statistical distributions (nonasymptotic)
- 62G07 Density estimation
- 62G08 Nonparametric regression and quantile regression
- 62H12 Estimation in multivariate analysis
- 62J05 Linear regression; mixed models

Zbl 0238.60017