Selection by partitioning the solution paths
DOI: 10.1214/18-EJS1434 · zbMATH Open: 1454.62086 · arXiv: 1606.07358 · OpenAlex: W2963650343 · Wikidata: Q129683770 · Scholia: Q129683770 · MaRDI QID: Q114375 · FDO: Q114375
Authors: Yang Liu, Peng Wang
Publication date: 1 January 2018
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1606.07358
Recommendations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Approximate penalization path for smoothly clipped absolute deviation
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Consistent selection of tuning parameters via variable selection stability
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
Classification (MSC)
- Statistical ranking and selection procedures (62F07)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Fuzziness, and linear inference and regression (62J86)
Cites Work
- Estimating the dimension of a model
- Bayesian model averaging: A tutorial. (with comments and a rejoinder).
- The Adaptive Lasso and Its Oracle Properties
- Extended Bayesian information criteria for model selection with large model spaces
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Stability Selection
- Title not available
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sparsity and Smoothness Via the Fused Lasso
- Sparse inverse covariance estimation with the graphical lasso
- Optimal predictive model selection.
- Sparsity oracle inequalities for the Lasso
- Model selection and estimation in the Gaussian graphical model
- Bayesian Model Averaging for Linear Regression Models
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Regularization parameter selections via generalized information criterion
- Random lasso
- Nonconcave penalized likelihood with a diverging number of parameters.
- Consistent selection of tuning parameters via variable selection stability
- Model selection in irregular problems: Applications to mapping quantitative trait loci
- Decoding by Linear Programming
- A note on the generalized information criterion for choice of a model
- Estimation in high-dimensional linear models with deterministic design matrices
- High Dimensional Variable Selection via Tilting
- A Model Selection Approach for the Identification of Quantitative Trait Loci in Experimental Crosses
- Likelihood-based selection and sharp parameter estimation
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Model Selection Principles in Misspecified Models
- Variable selection in nonparametric additive models
Cited In (5)
Uses Software