An improved global risk bound in concave regression


DOI: 10.1214/16-EJS1151
zbMATH Open: 1349.62126
arXiv: 1512.04658
MaRDI QID: Q309525


Authors: Sabyasachi Chatterjee


Publication date: 7 September 2016

Published in: Electronic Journal of Statistics

Abstract: A new risk bound is presented for the problem of convex/concave function estimation using the least squares estimator. The best known risk bound, as shown in [GSvex], scaled like $\log(en)\, n^{-4/5}$ under the mean squared error loss, up to a constant factor. The authors of [GSvex] conjectured that the logarithmic term may be an artifact of their proof. We show that the logarithmic term is indeed unnecessary and prove a risk bound which scales like $n^{-4/5}$ up to constant factors. Our proof technique involves one extra peeling step beyond a usual chaining-type argument. Our risk bound holds in expectation as well as with high probability, and it also extends to the case of model misspecification, where the true function may not be concave.
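
The estimator studied here is the least squares projection of the observations onto the cone of concave sequences; on an equispaced design, concavity amounts to nonpositive second differences, so the estimator solves a quadratic program. Below is a minimal illustrative sketch (not code from the paper), assuming the cvxpy package is available; the test function, noise level, and sample size are arbitrary choices for demonstration. The improved bound states that the mean squared error of this estimator decays like $n^{-4/5}$ up to constant factors.

```python
import numpy as np
import cvxpy as cp

# Illustrative setup (hypothetical choices, not from the paper):
# a concave truth f(x) = sqrt(x) observed with Gaussian noise on an equispaced grid.
rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)
f_true = np.sqrt(x)
y = f_true + 0.1 * rng.standard_normal(n)

# Concave least squares estimator: minimize the sum of squared residuals
# subject to nonpositive second differences (concavity on an equispaced grid).
theta = cp.Variable(n)
constraints = [theta[2:] - 2 * theta[1:-1] + theta[:-2] <= 0]
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - theta)), constraints)
problem.solve()

# Empirical mean squared error against the (here known) truth.
mse = np.mean((theta.value - f_true) ** 2)
print(f"mean squared error of the concave LSE: {mse:.5f}")
```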


Full work available at URL: https://arxiv.org/abs/1512.04658






Cited In (8)





