High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
From MaRDI portal
DOI: 10.1007/s00180-013-0436-3
zbMath: 1306.65035
OpenAlex: W2037366835
MaRDI QID: Q2259726
Jacopo Mandozzi, Peter Bühlmann
Publication date: 5 March 2015
Published in: Computational Statistics
Full work available at URL: http://doc.rero.ch/record/325774/files/180_2013_Article_436.pdf
Classification
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
Related Items
- Stable prediction in high-dimensional linear models
- Scalable inference for high-dimensional precision matrix
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Statistical inference for model parameters in stochastic gradient descent
- Impacts of high dimensionality in finite samples
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation
- Cellwise outlier detection with false discovery rate control
- A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models
- Structure learning of exponential family graphical model with false discovery rate control
- Post hoc confidence bounds on false positives using reference families
- Inference for sparse linear regression based on the leave-one-covariate-out solution path
- Cluster feature selection in high-dimensional linear models
- Sparse matrices in data analysis
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- False Discovery Rate Control Under General Dependence By Symmetrized Data Aggregation
- Variable selection after screening: with or without data splitting?
- Pruning variable selection ensembles
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- Bi-level variable selection via adaptive sparse group Lasso
- Sparse nonparametric model for regression with functional covariate
- On Lasso refitting strategies
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Statistical significance in high-dimensional linear models
- Estimation in high-dimensional linear models with deterministic design matrices
- Statistics for high-dimensional data. Methods, theory and applications.
- The Dantzig selector and sparsity oracle inequalities
- High-dimensional variable selection
- Sparsity in penalized empirical risk minimization
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- On the conditions used to prove oracle results for the Lasso
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- p-Values for High-Dimensional Regression
- Scaled sparse linear regression
- Sufficient dimension reduction and prediction in regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net