Pathwise coordinate optimization
From MaRDI portal
Publication: Q2466463
DOI: 10.1214/07-AOAS131
zbMATH Open: 1378.90064
arXiv: 0708.1485
MaRDI QID: Q2466463
Authors: Holger Höfling, Robert Tibshirani, Jerome H. Friedman, Trevor Hastie
Publication date: 15 January 2008
Published in: The Annals of Applied Statistics
Abstract: We consider "one-at-a-time" coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garrote and elastic net. It turns out that coordinate-wise descent does not work in the "fused lasso", however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
Full work available at URL: https://arxiv.org/abs/0708.1485
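The abstract describes the core lasso update: cycle through the coordinates, applying a soft-thresholding step to each one while holding the others fixed. A minimal sketch of that idea in Python/NumPy follows (the function names and the choice of objective scaling, (1/(2n))||y - Xb||^2 + lam*||b||_1, are illustrative assumptions, not the paper's exact code):

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso objective
    (1/(2n)) * ||y - X @ beta||^2 + lam * ||beta||_1.
    A sketch: fixed iteration count, no convergence check or warm starts."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta  # current residual, updated in place below
    for _ in range(n_iter):
        for j in range(p):
            col_norm = (X[:, j] @ X[:, j]) / n
            # inner product of x_j with the partial residual (coordinate j removed)
            rho = X[:, j] @ resid / n + beta[j] * col_norm
            old = beta[j]
            beta[j] = soft_threshold(rho, lam) / col_norm
            resid -= X[:, j] * (beta[j] - old)  # keep residual consistent
    return beta
```

Maintaining the residual incrementally, rather than recomputing y - X @ beta each pass, is what makes each coordinate update O(n) and the method competitive on large problems.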
Cites Work
- Nonlinear total variation based noise removal algorithms
- Least angle regression. (With discussion)
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Atomic Decomposition by Basis Pursuit
- Sparsity and Smoothness Via the Fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Better Subset Regression Using the Nonnegative Garrote
- A new approach to variable selection in least squares problems
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- The Explicit Inverse of a Tridiagonal Matrix
Cited In (showing first 100 items)
- Sparse-smooth regularized singular value decomposition
- Regression with outlier shrinkage
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- A modified adaptive Lasso for identifying interactions in the Cox model with the heredity constraint
- Regularized 3D functional regression for brain image data via Haar wavelets
- Regression on manifolds: estimation of the exterior derivative
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- A generic coordinate descent solver for non-smooth convex optimisation
- The use of random-effect models for high-dimensional variable selection problems
- A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression
- Adaptive log-density estimation
- The MM alternative to EM
- Group coordinate descent algorithms for nonconvex penalized regression
- Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
- Sparse regulatory networks
- Structure identification in panel data analysis
- Sparse low-rank separated representation models for learning from data
- Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator
- The use of vector bootstrapping to improve variable selection precision in Lasso models
- Screening-based Bregman divergence estimation with NP-dimensionality
- Improved variable selection with forward-lasso adaptive shrinkage
- ParNes: A rapidly convergent algorithm for accurate recovery of sparse and approximately sparse signals
- Transductive versions of the Lasso and the Dantzig selector
- Group structure detection for a high‐dimensional panel data model
- Robust and adaptive algorithms for online portfolio selection
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Nonstationary Modeling With Sparsity for Spatial Data via the Basis Graphical Lasso
- SPADES and mixture models
- Solution path clustering with adaptive concave penalty
- Sparse and robust normal and \(t\)-portfolios by penalized \(L_q\)-likelihood minimization
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Block coordinate descent algorithms for large-scale sparse multiclass classification
- SICA for Cox's proportional hazards model with a diverging number of parameters
- Feature selection guided by structural information
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- Penalized variable selection in competing risks regression
- Multicategory vertex discriminant analysis for high-dimensional data
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Marginalized Lasso in sparse regression
- Discussion: "A significance test for the lasso"
- Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization
- A distribution-based Lasso for a general single-index model
- Approximated penalized maximum likelihood for exploratory factor analysis: an orthogonal case
- Geometry preserving multi-task metric learning
- GAITA: a Gauss-Seidel iterative thresholding algorithm for \(\ell_q\) regularized least squares regression
- Spurious predictions with random time series: the Lasso in the context of paleoclimatic reconstructions. Discussion of: "A statistical analysis of multiple temperature proxies: are reconstructions of surface temperatures over the last 1000 years reliable?"
- High-Dimensional Sparse Additive Hazards Regression
- Variable selection and estimation in generalized linear models with the seamless \(L_0\) penalty
- High-dimensional heteroscedastic regression with an application to eQTL data analysis
- Estimation of stationary autoregressive models with the Bayesian LASSO
- Model selection in linear mixed-effect models
- Reconstructing DNA copy number by penalized estimation and imputation
- Tuning parameter selection for the adaptive LASSO in the autoregressive model
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- Managing randomization in the multi-block alternating direction method of multipliers for quadratic optimization
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- Variable selection in multivariate linear models for functional data via sparse regularization
- Sparse regression: scalable algorithms and empirical performance
- Coordinate majorization descent algorithm for nonconvex penalized regression
- A forward and backward stagewise algorithm for nonconvex loss functions with adaptive Lasso
- Estimation of an oblique structure via penalized likelihood factor analysis
- Pathwise Optimality for Benchmark Tracking
- A penalized likelihood method for structural equation modeling
- Adaptive hybrid screening for efficient lasso optimization
- General sparse boosting: improving feature selection of \(L_{2}\) boosting by correlation-based penalty family
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Variable selection for inhomogeneous spatial point process models
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- An \(L_1\)-regularized logistic model for detecting short-term neuronal interactions
- Gap safe screening rules for sparsity enforcing penalties
- Generalized score matching for non-negative data
- Sparse latent factor regression models for genome-wide and epigenome-wide association studies
- Convex biclustering
- An Alternating Method for Cardinality-Constrained Optimization: A Computational Study for the Best Subset Selection and Sparse Portfolio Problems
- Likelihood adaptively modified penalties
- Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects
- Spatially multi-scale dynamic factor modeling via sparse estimation
- New cyclic sparsity measures for deconvolution based on convex relaxation
- Variable selection for varying-coefficient models with the sparse regularization
- Subspace clustering of high-dimensional data: a predictive approach
- Some properties of generalized fused Lasso and its applications to high dimensional data
- On globally Q-linear convergence of a splitting method for group Lasso
- Sparse principal component regression for generalized linear models
- Simultaneous feature selection and clustering based on square root optimization
- A Scalable Hierarchical Lasso for Gene–Environment Interactions
- Another look at linear programming for feature selection via methods of regularization
- Edge selection based on the geometry of dually flat spaces for Gaussian graphical models
- Mixed Lasso estimator for stochastic restricted regression models
- Variable selection and parameter estimation with the Atan regularization method
- \(l_1\) regularized multiplicative iterative path algorithm for non-negative generalized linear models
- Iteratively reweighted adaptive Lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes
- Moderately clipped Lasso
- The dual and degrees of freedom of linearly constrained generalized Lasso