Stability Selection
From MaRDI portal
Publication: 4632639
DOI: 10.1111/j.1467-9868.2010.00740.x
zbMath: 1411.62142
OpenAlex: W2562162676
Wikidata: Q114873406
Scholia: Q114873406
MaRDI QID: Q4632639
Nicolai Meinshausen, Peter Bühlmann
Publication date: 30 April 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://doi.org/10.1111/j.1467-9868.2010.00740.x
Mathematics Subject Classification
- Estimation in multivariate analysis (62H12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic properties of nonparametric inference (62G20)
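For context, the core procedure of the paper can be sketched in a few lines: repeatedly draw subsamples of size ⌊n/2⌋, run a base selector such as the lasso on each, record which variables are selected, and keep those whose empirical selection frequency exceeds a threshold. The sketch below is illustrative only, not the authors' code; the regularization strength `alpha`, the number of subsamples, and the 0.6 threshold are placeholder choices (the paper considers thresholds in (0.5, 1)).

```python
import numpy as np
from sklearn.linear_model import Lasso

def stability_selection(X, y, alpha=0.1, n_subsamples=100, threshold=0.6, rng=None):
    """Stability selection with the lasso as base selector (illustrative sketch).

    Draws `n_subsamples` random subsamples of size n//2, fits a lasso on each,
    and returns the indices of variables whose selection frequency is at least
    `threshold`, together with the full frequency vector.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_subsamples):
        # Subsample half the observations without replacement.
        idx = rng.choice(n, size=n // 2, replace=False)
        model = Lasso(alpha=alpha).fit(X[idx], y[idx])
        # A variable counts as "selected" if its coefficient is nonzero.
        counts += model.coef_ != 0
    freq = counts / n_subsamples
    return np.flatnonzero(freq >= threshold), freq
```

On a toy regression with a few strong signal variables, the signal variables' selection frequencies approach 1 while noise variables are selected only sporadically, which is what makes the thresholding step relatively insensitive to the choice of `alpha`.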
Related Items
- Stable prediction in high-dimensional linear models
- A novel bagging approach for variable ranking and selection via a mixed importance measure
- A model averaging approach for the ordered probit and nested logit models with applications
- An empirical threshold of selection probability for analysis of high-dimensional correlated data
- Stability enhanced variable selection for a semiparametric model with flexible missingness mechanism and its application to the ChAMP study
- Stability of feature selection in classification issues for high-dimensional correlated data
- A unified framework of constrained regression
- An ensemble learning method for variable selection: application to high-dimensional data and missing values
- Short question-answers assessment using lexical and semantic similarity based features
- Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles
- Projection-based Inference for High-dimensional Linear Models
- Robust gene–environment interaction analysis using penalized trimmed regression
- Causal statistical inference in high dimensions
- Variable selection – A review and recommendations for the practicing statistician
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- IPF-LASSO: integrative \(L_1\)-penalized regression with penalty factors for prediction based on multi-omics data
- Causal Interaction in Factorial Experiments: Application to Conjoint Analysis
- Composite quantile regression for massive datasets
- AIC for the non-concave penalized likelihood method
- Sparse wavelet estimation in quantile regression with multiple functional predictors
- Time series graphical Lasso and sparse VAR estimation
- Model-based regression clustering for high-dimensional data: application to functional data
- Robust sure independence screening for nonpolynomial dimensional generalized linear models
- Learning Gaussian graphical models with latent confounders
- Robust subtractive stability measures for fast and exhaustive feature importance ranking and selection in generalised linear models
- Utilizing stability criteria in choosing feature selection methods yields reproducible results in microbiome data
- Subset Selection for Linear Mixed Models
- A structured brain‐wide and genome‐wide association study using ADNI PET images
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Robust and sparse learning of varying coefficient models with high-dimensional features
- Regularized Estimation in High-Dimensional Vector Auto-Regressive Models Using Spatio-Temporal Information
- On the selection of predictors by using greedy algorithms and information theoretic criteria
- Global debiased DC estimations for biased estimators via pro forma regression
- Genetic Underpinnings of Brain Structural Connectome for Young Adults
- Deconfounding and Causal Regularisation for Stability and External Validity
- Heuristic methods for stock selection and allocation in an index tracking problem
- Optimal estimation of direction in regression models with large number of parameters
- Is there a role for statistics in artificial intelligence?
- Scalable and efficient inference via CPE
- Inference for High-Dimensional Linear Mixed-Effects Models: A Quasi-Likelihood Approach
- Subsampling based variable selection for generalized linear models
- An ensemble EM algorithm for Bayesian variable selection
- Blessing of massive scale: spatial graphical model estimation with a total cardinality constraint approach
- Compressed spectral screening for large-scale differential correlation analysis with application in selecting glioblastoma gene modules
- Goodness-of-Fit Tests for High Dimensional Linear Models
- Distributed Bayesian posterior voting strategy for massive data
- Autoregressive models for gene regulatory network inference: sparsity, stability and causality issues
- Sparse learning of partial differential equations with structured dictionary matrix
- Bayesian networks for sex-related homicides: structure learning and prediction
- RandGA: injecting randomness into parallel genetic algorithm for variable selection
- A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
- Ranking-Based Variable Selection for high-dimensional data
- Stochastic correlation coefficient ensembles for variable selection
- A general framework for functional regression modelling
- Cross-Validation With Confidence
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Debiased Inference on Treatment Effect in a High-Dimensional Model
- ESTIMATION FOR THE PREDICTION OF POINT PROCESSES WITH MANY COVARIATES
- Likelihood Ratio Test in Multivariate Linear Regression: from Low to High Dimension
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Variable selection in sparse GLARMA models
- Rejoinder
- Discussion of ``Correlated variables in regression: clustering and sparse estimation
- Handling co-dependence issues in resampling-based variable selection procedures: a simulation study
- Change points in heavy‐tailed multivariate time series: Methods using precision matrices
- Likelihood adaptively modified penalties
- Composite large margin classifiers with latent subclasses for heterogeneous biomedical data
- Pruning variable selection ensembles
- The functional linear array model
- Robust sparse regression and tuning parameter selection via the efficient bootstrap information criteria
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Selection of the Regularization Parameter in Graphical Models Using Network Characteristics
- Functional Graphical Models
- Graphical Model Selection for Gaussian Conditional Random Fields in the Presence of Latent Variables
- Detection of differential item functioning in Rasch models by boosting techniques
- Bayesian Regression With Undirected Network Predictors With an Application to Brain Connectome Data
- A semiparametric graphical modelling approach for large-scale equity selection
- Shrinkage priors for Bayesian penalized regression
- Sparsity Oriented Importance Learning for High-Dimensional Linear Regression
- Veridical data science
- Benchmark and Survey of Automated Machine Learning Frameworks
- Semi-analytic approximate stability selection for correlated data in generalized linear models
- Taylor quasi-likelihood for limited generalized linear models
- High-dimensional statistical inference via DATE
- Stabilizing the Lasso against cross-validation variability
- Influence measures and stability for graphical models
- Detecting weak signals in high dimensions
- Analysis of presence-only data via semi-supervised learning approaches
- A review on instance ranking problems in statistical learning
- The cluster graphical Lasso for improved estimation of Gaussian graphical models
- Tuning-free ridge estimators for high-dimensional generalized linear models
- Model selection consistency of Lasso for empirical data
- Detecting possibly frequent change-points: wild binary segmentation 2 and steepest-drop model selection -- rejoinder
- Variable selection for survival data with a class of adaptive elastic net techniques
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- New hard-thresholding rules based on data splitting in high-dimensional imbalanced classification
- Aggregated hold out for sparse linear regression with a robust loss function
- Post-model-selection inference in linear regression models: an integrated review
- Estimating finite mixtures of ordinal graphical models
- Biclustering via structured regularized matrix decomposition
- The use of vector bootstrapping to improve variable selection precision in Lasso models
- Statistics for big data: a perspective
- Graphical models via joint quantile regression with component selection
- Stability orthogonal regression for system identification
- PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection
- Nonlinear predictive directions in clinical trials
- Mean and quantile boosting for partially linear additive models
- Extensions of stability selection using subsamples of observations and covariates
- Robust stability best subset selection for autocorrelated data based on robust location and dispersion estimator
- Detection of influential points as a byproduct of resampling-based variable selection procedures
- Improving cross-validated bandwidth selection using subsampling-extrapolation techniques
- Probing for sparse and fast variable selection with model-based boosting
- An update on statistical boosting in biomedicine
- A multicriteria approach to find predictive and sparse models with stable feature selection for high-dimensional data
- Gene set priorization guided by regulatory networks with p-values through kernel mixed model
- Grouped feature importance and combined features effect plot
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Rejoinder on: ``Hierarchical inference for genome-wide association studies: a view on methodology with software
- Primal path algorithm for compositional data analysis
- Characterization of weighted quantile sum regression for highly correlated data in a risk analysis setting
- Debiasing the debiased Lasso with bootstrap
- High-dimensional joint estimation of multiple directed Gaussian graphical models
- Classifier variability: accounting for training and testing
- Torus graphs for multivariate phase coupling analysis
- Bayesian variable selection for survival data using inverse moment priors
- High-dimensional simultaneous inference with the bootstrap
- Stability selection for Lasso, ridge and elastic net implemented with AFT models
- Identification of supervised and sparse functional genomic pathways
- Sparse probit linear mixed model
- Absolute penalty and shrinkage estimation in partially linear models
- Boosting flexible functional regression models with a high number of functional historical effects
- Gradient boosting for distributional regression: faster tuning and improved variable selection via noncyclical updates
- Data-driven discovery of PDEs in complex datasets
- Invariance, causality and robustness
- Variable selection techniques after multiple imputation in high-dimensional data
- A discussion on practical considerations with sparse regression methodologies
- Stability approach to selecting the number of principal components
- Modified SCAD penalty for constrained variable selection problems
- Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
- iPHLoc-ES: identification of bacteriophage protein locations using evolutionary and structural features
- Debiasing the Lasso: optimal sample size for Gaussian designs
- High-dimensional consistency in score-based and hybrid structure learning
- Selective inference with a randomized response
- Optimal estimation of slope vector in high-dimensional linear transformation models
- All Models are Wrong, but Many are Useful: Learning a Variable's Importance by Studying an Entire Class of Prediction Models Simultaneously
- Correlation and variable importance in random forests
- Regularized joint estimation of related vector autoregressive models
- Variable selection and model choice in structured survival models
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- Comments on ``Data science, big data and statistics
- Prediction error bounds for linear regression with the TREX
- Stabilizing Variable Selection and Regression
- On factor models with random missing: EM estimation, inference, and cross validation
- Multi Split Conformal Prediction
- Distribution-Free Predictive Inference For Regression
- Testing conditional independence in supervised learning algorithms
- Selection by partitioning the solution paths
- Kernel Knockoffs Selection for Nonparametric Additive Models
- Imputation and post-selection inference in models with missing data: an application to colorectal cancer surveillance guidelines
- Network classification with applications to brain connectomics
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Estimation and optimal structure selection of high-dimensional Toeplitz covariance matrix
- Factor-adjusted multiple testing of correlations
- Ensemble Binary Segmentation for irregularly spaced data with change-points
- On Bayesian new edge prediction and anomaly detection in computer networks
- Stable graphical model estimation with random forests for discrete, continuous, and mixed variables
- Variable selection with Hamming loss
- Learning stable and predictive structures in kinetic systems: Benefits of a causal approach
- High-dimensional variable selection via low-dimensional adaptive learning
- Penalized differential pathway analysis of integrative oncogenomics studies
- Regularization methods for high-dimensional sparse control function models
- High-dimensional regression in practice: an empirical study of finite-sample prediction, variable selection and ranking
- Pseudo estimation and variable selection in regression
- Sparse regression for extreme values
- Determining the number of factors in approximate factor models by twice K-fold cross validation
- A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data
- False discovery control for penalized variable selections with high-dimensional covariates
- Feature selection for data integration with mixed multiview data
- Spatially relaxed inference on high-dimensional linear models
- Spectral analysis of high-dimensional time series
- Adaptive step-length selection in gradient boosting for Gaussian location and scale models
- Visualization and assessment of model selection uncertainty
- Revisiting feature selection for linear models with FDR and power guarantees
- False Discovery Rate Control via Data Splitting
- Distributionally robust and generalizable inference
- On linear models for discrete operator inference in time dependent problems
- Derandomizing Knockoffs
- Quantitative robustness of instance ranking problems
- Subbotin graphical models for extreme value dependencies with applications to functional neuronal connectivity
- Stability Approach to Regularization Selection for Reduced-Rank Regression
- Modeling Postoperative Mortality in Older Patients by Boosting Discrete-Time Competing Risks Models
- Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates
Uses Software
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Consensus clustering: A resampling-based method for class discovery and visualization of gene expression microarray data
- Multiple hypothesis testing in microarray experiments
- Analyzing bagging
- Optimal predictive model selection
- Least angle regression (with discussion)
- Sparse permutation invariant covariance estimation
- Weak greedy algorithms
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Pathwise coordinate optimization
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Atomic Decomposition by Basis Pursuit
- Comparing the Characteristics of Gene Expression Profiles Derived by Univariate and Multivariate Classification Methods
- Bayesian Variable Selection in Multinomial Probit Models to Identify Molecular Signatures of Disease Stage
- Model selection and estimation in the Gaussian graphical model
- Greed is Good: Algorithmic Results for Sparse Approximation
- The Group Lasso for Logistic Regression
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Matching pursuits with time-frequency dictionaries
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- Random forests