Debiasing the Lasso: optimal sample size for Gaussian designs



DOI: 10.1214/17-AOS1630
zbMath: 1407.62270
arXiv: 1508.02757
MaRDI QID: Q1991670

Adel Javanmard, Andrea Montanari

Publication date: 30 October 2018

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1508.02757
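For orientation, the de-biased Lasso construction at the center of this work corrects the Lasso estimate by a one-step adjustment; the following is a minimal sketch in the standard notation of the de-biasing literature (not quoted from the record itself):
\[
\hat{\theta}^{\mathrm{d}} = \hat{\theta}^{\mathrm{Lasso}} + \frac{1}{n}\, M X^{\mathsf{T}} \bigl( y - X \hat{\theta}^{\mathrm{Lasso}} \bigr),
\]
where \(X \in \mathbb{R}^{n \times p}\) is the design matrix and \(M\) is an approximate inverse of the sample covariance \(\hat{\Sigma} = X^{\mathsf{T}} X / n\). The coordinates of \(\hat{\theta}^{\mathrm{d}}\) are approximately Gaussian, which is what enables confidence intervals and p-values; the paper studies how large the sample size must be, relative to the sparsity, for this to hold under Gaussian designs.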



Related Items

An optimal statistical and computational framework for generalized tensor estimation
Significance testing in non-sparse high-dimensional linear models
Testability of high-dimensional linear models with nonsparse structures
Semi-supervised empirical risk minimization: using unlabeled data to improve prediction
De-biasing the Lasso with degrees-of-freedom adjustment
Ridge regression revisited: debiasing, thresholding and bootstrap
Projection-based Inference for High-dimensional Linear Models
An improved algorithm for high-dimensional continuous threshold expectile model with variance heterogeneity
Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
Predictor ranking and false discovery proportion control in high-dimensional regression
Compositional knockoff filter for high-dimensional regression analysis of microbiome data
A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
Inference and Estimation for Random Effects in High-Dimensional Linear Mixed Models
Debiasing the debiased Lasso with bootstrap
Detangling robustness in high dimensions: composite versus model-averaged estimation
Debiasing convex regularized estimators and interval estimation in linear models
UNIFORM-IN-SUBMODEL BOUNDS FOR LINEAR REGRESSION IN A MODEL-FREE FRAMEWORK
Forward-selected panel data approach for program evaluation
Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis
Entrywise limit theorems for eigenvectors of signal-plus-noise matrix models with weak signals
Universality of regularized regression estimators in high dimensions
The Lasso with general Gaussian designs with applications to hypothesis testing
An integrated surrogate model constructing method: annealing combinable Gaussian process
Estimation and inference in sparse multivariate regression and conditional Gaussian graphical models under an unbalanced distributed setting
Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
Variable selection in the Box-Cox power transformation model
Adaptive estimation of high-dimensional signal-to-noise ratios
The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
Variable selection via adaptive false negative control in linear regression
Optimal sparsity testing in linear regression model
Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
On rank estimators in increasing dimensions
The de-biased group Lasso estimation for varying coefficient models
Second-order Stein: SURE for SURE and other applications in high-dimensional inference
Control variate selection for Monte Carlo integration
Spectral method and regularized MLE are both optimal for top-\(K\) ranking
Scale calibration for high-dimensional robust regression
On the asymptotic variance of the debiased Lasso
Spatially relaxed inference on high-dimensional linear models
Ensemble Kalman inversion for sparse learning of dynamical systems from time-averaged data
An \({\ell_p}\) theory of PCA and spectral clustering
Information criteria bias correction for group selection
Asymptotic normality of robust \(M\)-estimators with convex penalty



