Coordinate descent algorithms for lasso penalized regression
From MaRDI portal
Publication:2482976
Abstract: Imposition of a lasso penalty shrinks parameter estimates toward zero and performs continuous model selection. Lasso penalized regression is capable of handling linear regression problems where the number of predictors far exceeds the number of cases. This paper tests two exceptionally fast algorithms for estimating regression coefficients with a lasso penalty. The previously known algorithm is based on cyclic coordinate descent. Our new algorithm is based on greedy coordinate descent and Edgeworth's algorithm for ordinary regression. Each algorithm relies on a tuning constant that can be chosen by cross-validation. In some regression problems it is natural to group parameters and penalize parameters group by group rather than separately. If the group penalty is proportional to the Euclidean norm of the parameters of the group, then it is possible to majorize the norm and reduce parameter estimation to regression with a lasso penalty. Thus, the existing algorithm can be extended to novel settings. Each algorithm is tested on simulated data, real data, or both. The Appendix proves that a greedy form of the algorithm converges to the minimum value of the objective function.
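The cyclic coordinate descent strategy described in the abstract can be sketched in a few lines: each coefficient is updated in turn, holding the others fixed, and the one-dimensional lasso subproblem has a closed-form soft-thresholding solution. The sketch below is a minimal illustration of that generic scheme, not the authors' implementation; the function names and the stopping rule (a fixed sweep count rather than a convergence test) are assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1.

    Each sweep updates one coefficient at a time while holding the
    others fixed; the univariate subproblem in b_j is solved exactly
    by soft-thresholding the partial residual correlation.
    """
    n, p = X.shape
    b = np.zeros(p)
    r = y.copy()                     # residual y - X b (b starts at 0)
    col_sq = (X ** 2).sum(axis=0)    # ||X_j||^2 for each column
    for _ in range(n_sweeps):
        for j in range(p):
            # correlation of X_j with the partial residual excluding j
            rho = X[:, j] @ r + col_sq[j] * b[j]
            b_new = soft_threshold(rho, lam) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)   # keep residual current
            b[j] = b_new
    return b
```

The greedy variant of the paper differs only in the sweep order: instead of cycling through j = 1, ..., p, it updates the coordinate promising the largest decrease in the objective at each step.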
Recommendations
- Coordinate descent algorithm for covariance graphical Lasso
- Coordinate majorization descent algorithm for nonconvex penalized regression
- A gradient descent algorithm for LASSO
- Coordinate descent algorithms
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Adaptive Randomized Coordinate Descent for Sparse Systems: Lasso and Greedy Algorithms
- A descent method for least absolute deviation Lasso problems
- Algorithms for Fitting the Constrained Lasso
- Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
Cited in
(showing first 100 items)
- Penalized and Constrained Optimization: An Application to High-Dimensional Website Advertising
- Penalized and constrained LAD estimation in fixed and high dimension
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
- Sparse matrix linear models for structured high-throughput data
- Estimation of multivariate dependence structures via constrained maximum likelihood
- Studies of the adaptive network-constrained linear regression and its application
- Model-based feature selection and clustering of RNA-seq data for unsupervised subtype discovery
- The dual and degrees of freedom of linearly constrained generalized Lasso
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Variable selection and estimation using a continuous approximation to the \(L_0\) penalty
- Sparse regression: scalable algorithms and empirical performance
- Shrinkage estimation and variable selection in multiple regression models with random coefficient autoregressive errors
- Sparse methods for automatic relevance determination
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Gene selection and prediction for cancer classification using support vector machines with a reject option
- Stochastic block-coordinate gradient projection algorithms for submodular maximization
- Orthogonal rank-one matrix pursuit for low rank matrix completion
- Outlier detection under a covariate-adjusted exponential regression model with censored data
- Simultaneous estimation of quantile regression functions using B-splines and total variation penalty
- Confidence intervals for sparse penalized regression with random designs
- Adaptive Randomized Coordinate Descent for Sparse Systems: Lasso and Greedy Algorithms
- Linearized alternating direction method of multipliers for sparse group and fused Lasso models
- Faster subgradient methods for functions with Hölderian growth
- A novel heuristic algorithm to solve penalized regression-based clustering model
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Multiple self-controlled case series for large-scale longitudinal observational databases
- Sparse principal component regression for generalized linear models
- PUlasso: High-Dimensional Variable Selection With Presence-Only Data
- The sliding Frank-Wolfe algorithm and its application to super-resolution microscopy
- A classification-oriented dictionary learning model: explicitly learning the particularity and commonality across categories
- Convex biclustering
- A fast algorithm for the accelerated failure time model with high-dimensional time-to-event data
- Efficient LED-SAC sparse estimator using fast sequential adaptive coordinate-wise optimization (LED-2SAC)
- Coordinate descent based hierarchical interactive Lasso penalized logistic regression and its application to classification problems
- Group variable selection via convex log-exp-sum penalty with application to a breast cancer survivor study
- A statistical framework for pathway and gene identification from integrative analysis
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- Efficient computation for differential network analysis with applications to quadratic discriminant analysis
- Super-resolution for doubly-dispersive channel estimation
- Functional logistic regression: a comparison of three methods
- A new scope of penalized empirical likelihood with high-dimensional estimating equations
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Coordinate descent algorithm of generalized fused Lasso logistic regression for multivariate trend filtering
- LASSO for streaming data with adaptative filtering
- Advanced algorithms for penalized quantile and composite quantile regression
- Multivariate sparse group Lasso for the multivariate multiple linear regression with an arbitrary group structure
- An \(L_1\)-regularized logistic model for detecting short-term neuronal interactions
- A fast algorithm for detecting gene-gene interactions in genome-wide association studies
- scientific article (zbMATH DE number 6982301; no title available)
- Double fused Lasso penalized LAD for matrix regression
- Partial penalized empirical likelihood ratio test under sparse case
- LARS-type algorithm for group Lasso
- Coordinate majorization descent algorithm for nonconvex penalized regression
- Mixed Lasso estimator for stochastic restricted regression models
- Clustering of subsample means based on pairwise L1 regularized empirical likelihood
- Variable selection based on squared derivative averages
- Regularization for Cox's proportional hazards model with NP-dimensionality
- On the complexity analysis of randomized block-coordinate descent methods
- Selecting massive variables using an iterated conditional modes/medians algorithm
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- A gradient descent algorithm for LASSO
- A fast unified algorithm for solving group-lasso penalize learning problems
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Model selection for factorial Gaussian graphical models with an application to dynamic regulatory networks
- Standardization and the group lasso penalty
- Thresholding-based iterative selection procedures for model selection and shrinkage
- APPLE: approximate path for penalized likelihood estimators
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- Pathwise coordinate optimization
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- Majorization minimization by coordinate descent for concave penalized generalized linear models
- Sparse directed acyclic graphs incorporating the covariates
- Group coordinate descent algorithms for nonconvex penalized regression
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- Conducting sparse feature selection on arbitrarily long phrases in text corpora with a focus on interpretability
- A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression
- Model selection for high-dimensional quadratic regression via regularization
- Efficient block-coordinate descent algorithms for the group Lasso
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- On the complexity of parallel coordinate descent
- Regularized 3D functional regression for brain image data via Haar wavelets
- New cyclic sparsity measures for deconvolution based on convex relaxation
- \(L_{1}\) penalized estimation in the Cox proportional hazards model
- An algorithm for the estimation of regularization paths of generalized linear models with group Lasso penalty
- Group penalized quantile regression
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- The group exponential Lasso for bi-level variable selection
- Sparsity with sign-coherent groups of variables via the cooperative-Lasso
- Accelerated, parallel, and proximal coordinate descent
- Variable selection and estimation in generalized linear models with the seamless \(L_0\) penalty
- Robust and sparse multigroup classification by the optimal scoring approach
- A selective review of group selection in high-dimensional models
- Penalized estimation of directed acyclic graphs from discrete data
- Structured sparsity through convex optimization
- Coordinate descent algorithm for covariance graphical Lasso
- Laplace error penalty-based variable selection in high dimension
- Inferring sparse Gaussian graphical models with latent structure
This page was built for publication: Coordinate descent algorithms for lasso penalized regression