High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
DOI: 10.1007/s11222-019-09914-9 · zbMATH: 1437.62151 · arXiv: 1808.00723 · OpenAlex: W3102310197 · Wikidata: Q90054052 · Scholia: Q90054052 · MaRDI QID: Q2302521
Sach Mukherjee, Steven M. Hill, Fan Wang, Sylvia Richardson
Publication date: 26 February 2020
Published in: Statistics and Computing
Full work available at URL: https://arxiv.org/abs/1808.00723
MSC classifications:
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
Related Items (3)
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- The Adaptive Lasso and Its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Bayesian backfitting (with comments and a rejoinder)
- Nonconcave penalized likelihood with a diverging number of parameters
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- High-dimensional graphs and variable selection with the Lasso
- Regularization in regression: comparing Bayesian and frequentist methods in a poorly informative situation
- Scalable Bayesian Regression in High Dimensions With Multiple Data Sources
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- DASSO: Connections Between the Dantzig Selector and Lasso
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- Sparsity and Smoothness Via the Fused Lasso
- Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Ridge Regression: Biased Estimation for Nonorthogonal Problems