Pathwise coordinate optimization for sparse learning: algorithm and theory
From MaRDI portal
Publication:1747736
DOI: 10.1214/17-AOS1547 · zbMath: 1416.62413 · arXiv: 1412.7477 · OpenAlex: W2963994662 · MaRDI QID: Q1747736
Publication date: 27 April 2018
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1412.7477
Keywords: oracle property; global linear convergence; nonconvex sparse learning; optimal statistical rates of convergence; pathwise coordinate optimization
MSC classification: Ridge regression; shrinkage estimators (Lasso) (62J07) · Generalized linear models (logistic models) (62J12) · Nonconvex programming, global optimization (90C26) · Methods of reduced gradient type (90C52)
Related Items
- Penalized Estimation of Frailty-Based Illness–Death Models for Semi-Competing Risks
- Fuzzy granular convolutional classifiers
- Sparse and robust estimation with ridge minimax concave penalty
- Accelerate the warm-up stage in the Lasso computation via a homotopic approach
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity
- Model selection for inferring Gaussian graphical models
- Test of significance for high-dimensional longitudinal data
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Gradient methods for minimizing composite functions
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- On the complexity analysis of randomized block-coordinate descent methods
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- On the convergence of the coordinate descent method for convex differentiable minimization
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Simultaneous analysis of Lasso and Dantzig selector
- Multi-stage convex relaxation for feature selection
- Calibrating nonconvex penalized regression in ultra-high dimension
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- The huge Package for High-dimensional Undirected Graph Estimation in R
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers