Approximation by log-concave distributions, with applications to regression

DOI: 10.1214/10-AOS853
zbMATH Open: 1216.62023
arXiv: 1002.3448
OpenAlex: W3100270455
MaRDI QID: Q548533


Authors: Lutz Dümbgen, Richard Samworth, Dominic Schuhmacher


Publication date: 29 June 2011

Published in: The Annals of Statistics

Abstract: We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback--Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by some hyperplane. Furthermore we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response $Y = \mu(X) + \epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions while $\epsilon$ is a random error with log-concave density and mean zero.
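
For orientation, the two quantities named in the abstract can be made explicit. The formulations below follow standard conventions and are stated here as assumptions; the paper's own definitions may differ in details such as the choice of norm. The Mallows distance is
\[
  D_1(P, Q) \;=\; \inf\bigl\{\, \mathbb{E}\,\lVert X - Y \rVert \;:\; X \sim P,\ Y \sim Q \,\bigr\},
\]
the infimum being taken over all couplings of $P$ and $Q$. The Kullback--Leibler-type criterion amounts to choosing, among log-concave densities $f$, a maximizer of the expected log-density,
\[
  f^{*} \;\in\; \operatorname*{arg\,max}_{f\ \text{log-concave}} \int \log f \, dP,
\]
which, when $P$ has a density $p$, is equivalent to minimizing the Kullback--Leibler divergence $\int p \log(p/f)\,dx$ over log-concave $f$; when $P$ is the empirical distribution of a sample, the maximizer is the log-concave maximum likelihood estimator mentioned in the abstract.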


Full work available at URL: https://arxiv.org/abs/1002.3448
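
As a small illustrative sketch (not code from the paper): in one dimension the Mallows distance $D_1$ coincides with the 1-Wasserstein distance, which SciPy can estimate from samples. The distributions, the regression function $\mu(x) = 2x$, and the Laplace error below are purely illustrative choices.

    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(0)

    # Empirical Mallows / 1-Wasserstein distance D_1 between two samples
    # (illustrative distributions; the Laplace density is log-concave).
    x = rng.normal(loc=0.0, scale=1.0, size=2000)
    y = rng.laplace(loc=0.0, scale=1.0, size=2000)
    print("empirical D_1:", wasserstein_distance(x, y))

    # Data from the regression model in the abstract, Y = mu(X) + eps,
    # with an illustrative mu(x) = 2x and a mean-zero, log-concave (Laplace) error.
    X = rng.uniform(-1.0, 1.0, size=2000)
    eps = rng.laplace(loc=0.0, scale=1.0, size=2000)
    Y = 2.0 * X + eps
    print("sample mean of the error:", eps.mean())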




