Discussion: One-step sparse estimates in nonconcave penalized likelihood models
Publication: 939651
DOI: 10.1214/07-AOS0316A
zbMath: 1282.62096
arXiv: 0808.1013
OpenAlex: W2025888759
MaRDI QID: Q939651
Publication date: 28 August 2008
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0808.1013
Mathematics Subject Classification
- Nonparametric regression and quantile regression (62G08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic properties of nonparametric inference (62G20)
- Linear regression; mixed models (62J05)
- Monte Carlo methods (65C05)
Related Items
- Variable selection in partial linear regression with functional covariate
- The use of random-effect models for high-dimensional variable selection problems
- High-dimensional sparse index tracking based on a multi-step convex optimization approach
- Consistent group selection in high-dimensional linear regression
- Partial correlation graphical LASSO
- Nonconcave penalized composite conditional likelihood estimation of sparse Ising models
- Least angle and \(\ell _{1}\) penalized regression: a review
- Variable selection using penalized empirical likelihood
- Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO
- General Sparse Boosting: Improving Feature Selection of L2Boosting by Correlation-Based Penalty Family
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- Heuristics of instability and stabilization in model selection
- Least angle regression. (With discussion)
- High-dimensional generalized linear models and the lasso
- High-dimensional graphs and variable selection with the Lasso
- The Group Lasso for Logistic Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A new approach to variable selection in least squares problems