High-dimensional variable selection via low-dimensional adaptive learning
Publication: 2044323
DOI: 10.1214/21-EJS1797 · zbMath: 1471.62557 · arXiv: 1905.00105 · OpenAlex: W2942689108 · MaRDI QID: Q2044323
Maria Kateri, Christian Staerk, Ioannis Ntzoufras
Publication date: 9 August 2021
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1905.00105
Keywords: subset selection · high-dimensional data · sparsity · stability selection · extended Bayesian information criterion
MSC classification: Learning and adaptive systems in artificial intelligence (68T05) · Statistical ranking and selection procedures (62F07) · Statistical aspects of information-theoretic topics (62B10) · Statistical aspects of big data and data science (62R07)
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Variable selection in high-dimensional linear models: partially faithful distributions and the PC-simple algorithm
- The Adaptive Lasso and Its Oracle Properties
- Best subset selection via a modern optimization lens
- Extensions of stability selection using subsamples of observations and covariates
- Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem
- When do stepwise algorithms meet subset selection criteria?
- Tournament screening cum EBIC for feature selection with high-dimensional feature spaces
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Maximum likelihood principle and model selection when the true model is unspecified
- Least angle regression. (With discussion)
- Extended BIC for linear regression models with diverging number of relevant features and high or ultra-high feature spaces
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Least squares after model selection in high-dimensional sparse models
- Pathwise coordinate optimization
- On the expectation of the maximum of IID geometric random variables
- High-dimensional graphs and variable selection with the Lasso
- Description of the Minimizers of Least Squares Regularized with $\ell_0$-norm. Uniqueness of the Global Minimizer
- Extended BIC for small-n-large-P sparse GLM
- Extended Bayesian information criteria for model selection with large model spaces
- Regressions by Leaps and Bounds
- A Branch and Bound Algorithm for Feature Subset Selection
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Selection bias in gene extraction on the basis of microarray gene-expression data
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Stability Selection
- Regression Shrinkage and Selection via the Lasso: A Retrospective
- High Dimensional Variable Selection via Tilting
- Variable Selection with Error Control: Another Look at Stability Selection
- The restricted consistency property of leave-$n_v$-out cross-validation for high-dimensional variable selection
- Linear Model Selection by Cross-Validation
- Shotgun Stochastic Search for “Large p” Regression
- Regularization and Variable Selection Via the Elastic Net
- A Split-and-Merge Bayesian Variable Selection Approach for Ultrahigh Dimensional Regression
- Model Selection and Estimation in Regression with Grouped Variables
- A Sharper Form of the Borel-Cantelli Lemma and the Strong Law
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Model Selection Principles in Misspecified Models
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- A new look at the statistical model identification