Abstract: We consider "one-at-a-time" coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garrote and elastic net. It turns out that coordinate-wise descent does not work in the "fused lasso", however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
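The coordinate-wise descent idea the abstract describes can be illustrated with a minimal sketch: for the lasso objective (1/2n)||y - Xb||^2 + lam*||b||_1, each coordinate update is a univariate soft-thresholding step on the partial residual. This is a simplified illustration of the general technique, not the authors' implementation; the function names and the 1/(2n) loss scaling are choices made here for the example.

```python
import numpy as np

def soft_threshold(z, gamma):
    # S(z, gamma) = sign(z) * max(|z| - gamma, 0)
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    # Cyclic coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1.
    # Each pass updates one coefficient at a time, holding the others fixed.
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-column curvature terms
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove coordinate j's current contribution.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            # Univariate lasso solution in coordinate j.
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta
```

With an orthonormal design the coordinates decouple and a single pass reaches the exact solution, which is the soft-thresholded least-squares estimate; in general designs the cyclic passes converge to the lasso solution for this separable convex objective.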
Cites work
- scientific article; zbMATH DE number 1818892
- scientific article; zbMATH DE number 845714
- A new approach to variable selection in least squares problems
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Atomic Decomposition by Basis Pursuit
- Better Subset Regression Using the Nonnegative Garrote
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Least angle regression. (With discussion)
- Model Selection and Estimation in Regression with Grouped Variables
- Nonlinear total variation based noise removal algorithms
- Regularization and Variable Selection Via the Elastic Net
- Sparsity and Smoothness Via the Fused Lasso
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- The Explicit Inverse of a Tridiagonal Matrix
Cited in
- On path restoration for censored outcomes
- Sparse-smooth regularized singular value decomposition
- Regression with outlier shrinkage
- Managing randomization in the multi-block alternating direction method of multipliers for quadratic optimization
- Model selection in linear mixed models
- On the choice of high-dimensional regression parameters in Gaussian random tomography
- Estimating time-varying networks
- On stepwise pattern recovery of the fused Lasso
- Model selection for high-dimensional quadratic regression via regularization
- Improved Pathwise Coordinate Descent for Power Penalties
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
- A modified adaptive Lasso for identifying interactions in the Cox model with the heredity constraint
- Learning Oncogenic Pathways from Binary Genomic Instability Data
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- Regularized 3D functional regression for brain image data via Haar wavelets
- Simultaneous estimation of quantile regression functions using B-splines and total variation penalty
- Split Bregman method for large scale fused Lasso
- Sparse estimation via nonconcave penalized likelihood in factor analysis model
- ADMM for High-Dimensional Sparse Penalized Quantile Regression
- Greedy algorithms for prediction
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- Inferring sparse Gaussian graphical models with latent structure
- A Note on Application of Nesterov’s Method in Solving Lasso-Type Problems
- Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures
- Coordinate descent algorithm of generalized fused Lasso logistic regression for multivariate trend filtering
- Regression on manifolds: estimation of the exterior derivative
- Spectrally Sparse Nonparametric Regression via Elastic Net Regularized Smoothers
- Low-rank model with covariates for count data with missing values
- Multiple change-point detection: a selective overview
- Exponential regression for censored data with outliers
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- Variable selection in multivariate linear models for functional data via sparse regularization
- Edge selection for undirected graphs
- A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery
- A coordinate descent method for total variation minimization
- Usage of the GO estimator in high dimensional linear models
- Sparse regression: scalable algorithms and empirical performance
- Variable Selection in Regression-Based Estimation of Dynamic Treatment Regimes
- Homogeneity Pursuit in Single Index Models based Panel Data Analysis
- Primal path algorithm for compositional data analysis
- An efficient approach for discriminant analysis based on adaptive feature augmentation
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Kernel Cox partially linear regression: building predictive models for cancer patients' survival
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Joint Structural Break Detection and Parameter Estimation in High-Dimensional Nonstationary VAR Models
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Learning and estimation applications of an online homotopy algorithm for a generalization of the LASSO
- Fusion learning algorithm to combine partially heterogeneous Cox models
- Sparse permutation invariant covariance estimation
- Coordinate majorization descent algorithm for nonconvex penalized regression
- Subgroup identification via homogeneity pursuit for dense longitudinal/spatial data
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- Block decomposition methods for total variation by primal-dual stitching
- Penalized polygram regression
- Tensor denoising with trend filtering
- Trustworthy regularized huber regression for outlier detection
- The use of random-effect models for high-dimensional variable selection problems
- Different types of Bernstein operators in inference of Gaussian graphical model
- Sparse partial least squares regression for on‐line variable selection with multivariate data streams
- Understanding large text corpora via sparse machine learning
- Sparse estimation of multivariate Poisson log‐normal models from count data
- On the taut string interpretation and other properties of the Rudin-Osher-Fatemi model in one dimension
- Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems
- Fused-MCP With Application to Signal Processing
- A generic coordinate descent solver for non-smooth convex optimisation
- Multiscale change point inference. With discussion and authors' reply
- Sparse reduced-rank regression for simultaneous dimension reduction and variable selection
- A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression
- Robust Variable Selection With Exponential Squared Loss
- The MM alternative to EM
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- A forward and backward stagewise algorithm for nonconvex loss functions with adaptive Lasso
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- Adaptive log-density estimation
- Estimation of an oblique structure via penalized likelihood factor analysis
- Spatially adaptive binary classifier using B-splines and total variation penalty
- Fused Lasso nearly-isotonic signal approximation in general dimensions
- Generalized fused Lasso for grouped data in generalized linear models
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- On variable selection in a semiparametric AFT mixture cure model
- Regularized variational estimation for exploratory item factor analysis
- Path-following methods for maximum a posteriori estimators in Bayesian hierarchical models: how estimates depend on hyperparameters
- Sparse directed acyclic graphs incorporating the covariates
- Informative gene selection for microarray classification via adaptive elastic net with conditional mutual information
- Simultaneous analysis of Lasso and Dantzig selector
- A significance test for the lasso
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Covariate selection for accelerated failure time data
- A penalized likelihood method for structural equation modeling
- A modified information criterion for tuning parameter selection in 1d fused LASSO for inference on multiple change points
- Efficient learning and feature selection in high-dimensional regression
- Pathwise Optimality for Benchmark Tracking
- A novel regularization method for estimation and variable selection in multi-index models
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- A majorization-minimization approach to variable selection using spike and slab priors
This page was built for publication: Pathwise coordinate optimization