A modified local quadratic approximation algorithm for penalized optimization problems
DOI: 10.1016/j.csda.2015.08.019
zbMath: 1468.62114
OpenAlex: W1478304458
MaRDI QID: Q147630
Authors: Sangin Lee, Sunghoon Kwon, Yongdai Kim
Publication date: February 2016
Published in: Computational Statistics & Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2015.08.019
MSC classification:
- Computational methods for problems pertaining to statistics (62-08)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items
- On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
- Penalized generalized estimating equations approach to longitudinal data with multinomial responses
- Sparse pathway-based prediction models for high-throughput molecular data
- Simultaneous spatial smoothing and outlier detection using penalized regression, with application to childhood obesity surveillance from electronic health records
- Genetic algorithm versus classical methods in sparse index tracking

Uses Software
- ncpen
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Monotonicity of quadratic-approximation algorithms
- One-step sparse estimates in nonconcave penalized likelihood models
- Multiclass sparse logistic regression for classification of multiple cancer types using gene expression data
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Piecewise linear regularized solution paths
- Atomic Decomposition by Basis Pursuit
- The Concave-Convex Procedure
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data
- A new approach to variable selection in least squares problems
- L1-Regularization Path Algorithm for Generalized Linear Models
- Smoothly Clipped Absolute Deviation on High Dimensions
- Variable Selection and Model Building via Likelihood Basis Pursuit
- Structural modelling with sparse kernels