Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity

From MaRDI portal
Publication:2012200

DOI: 10.1214/16-AOS1461 · zbMATH Open: 1371.62045 · arXiv: 1506.05539 · OpenAlex: W1666335228 · MaRDI QID: Q2012200

Zi-Jian Guo, T. Tony Cai

Publication date: 28 July 2017

Published in: The Annals of Statistics

Abstract: Confidence sets play a fundamental role in statistical inference. In this paper, we consider confidence intervals for high-dimensional linear regression with random design. We first establish the convergence rates of the minimax expected length for confidence intervals in the oracle setting where the sparsity parameter is given. The focus is then on the problem of adaptation to sparsity in the construction of confidence intervals. Ideally, an adaptive confidence interval should have its length automatically adjusted to the sparsity of the unknown regression vector, while maintaining a prespecified coverage probability. It is shown that such a goal is in general not attainable, except when the sparsity parameter is restricted to a small region over which the confidence intervals have the optimal length of the usual parametric rate. It is further demonstrated that the lack of adaptivity is not due to the conservativeness of the minimax framework, but is fundamentally caused by the difficulty of accurately learning the bias.


Full work available at URL: https://arxiv.org/abs/1506.05539
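The confidence intervals studied in this setting are commonly built by debiasing an initial Lasso estimate. The sketch below illustrates one standard such construction (the debiased/desparsified Lasso for a single coordinate); it is a minimal illustration of the general technique, not the authors' exact procedure, and the tuning choices (regularization level, noise estimate) are simplified assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated high-dimensional design: n samples, p >> n is typical in this setting.
rng = np.random.default_rng(0)
n, p, s = 200, 500, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0  # sparse truth with s nonzero coefficients
y = X @ beta + rng.standard_normal(n)

# Pilot Lasso fit; lambda ~ sqrt(2 log p / n) is the usual theoretical scale
# (a simplifying assumption here, not a data-driven choice).
lam = np.sqrt(2 * np.log(p) / n)
pilot = Lasso(alpha=lam, max_iter=10000).fit(X, y)
resid = y - pilot.predict(X)

# Debias coordinate j: nodewise Lasso of X[:, j] on the remaining columns
# yields a score vector z roughly orthogonal to the other predictors.
j = 0
X_minus_j = np.delete(X, j, axis=1)
nodewise = Lasso(alpha=lam, max_iter=10000).fit(X_minus_j, X[:, j])
z = X[:, j] - nodewise.predict(X_minus_j)

# One-step bias correction and plug-in standard error.
b_debiased = pilot.coef_[j] + z @ resid / (z @ X[:, j])
sigma_hat = np.sqrt(resid @ resid / n)  # crude noise-level estimate
se = sigma_hat * np.linalg.norm(z) / abs(z @ X[:, j])

# Nominal 95% confidence interval for beta_j.
ci = (b_debiased - 1.96 * se, b_debiased + 1.96 * se)
print(f"CI for beta_{j}: [{ci[0]:.3f}, {ci[1]:.3f}]")
```

The paper's negative result concerns exactly the length of such intervals: when the true sparsity is unknown, no procedure can both maintain coverage and shrink its length adaptively outside a narrow sparsity regime, because the bias term corrected above cannot be learned accurately enough.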




Cited In (61)





