Pathwise coordinate optimization
From MaRDI portal
Publication:2466463
DOI: 10.1214/07-AOAS131 · zbMATH Open: 1378.90064 · arXiv: 0708.1485 · MaRDI QID: Q2466463
Authors: Holger Höfling, Robert Tibshirani, Jerome H. Friedman, Trevor Hastie
Publication date: 15 January 2008
Published in: The Annals of Applied Statistics
Abstract: We consider "one-at-a-time" coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the $L_1$-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with the well-known LARS (or homotopy) procedure in large lasso problems, and that it can be applied to related methods such as the garotte and elastic net. It turns out that coordinate-wise descent does not work in the "fused lasso," however, so we derive a generalized algorithm that yields the solution in much less time than a standard convex optimizer. Finally, we generalize the procedure to the two-dimensional fused lasso, and demonstrate its performance on some image smoothing problems.
Full work available at URL: https://arxiv.org/abs/0708.1485
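The coordinate-wise update the abstract refers to has a closed form for the lasso: minimizing the objective over one coefficient at a time reduces to soft-thresholding a partial-residual correlation. A minimal sketch of cyclic coordinate descent for the lasso (this is an illustrative reconstruction, not the authors' code; the function names and toy data are hypothetical):

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for
        minimize_beta (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1.
    Each pass updates one coordinate at a time in closed form."""
    n, p = X.shape
    beta = np.zeros(p)
    residual = y.copy()  # residual for beta = 0
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed from the fit
            r_j = residual + X[:, j] * beta[j]
            z = X[:, j] @ r_j / n
            # exact one-dimensional minimizer: soft-threshold, then rescale
            beta_j_new = soft_threshold(z, lam) / (X[:, j] @ X[:, j] / n)
            residual = r_j - X[:, j] * beta_j_new
            beta[j] = beta_j_new
    return beta
```

A large penalty drives every coefficient exactly to zero, while a small penalty recovers a sparse signal with mild shrinkage, which is the behavior the pathwise strategy exploits by warm-starting each solution from the previous value of the penalty.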
Cites Work
- Nonlinear total variation based noise removal algorithms
- Least angle regression. (With discussion)
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adapting to Unknown Smoothness via Wavelet Shrinkage
- Atomic Decomposition by Basis Pursuit
- Sparsity and Smoothness Via the Fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Better Subset Regression Using the Nonnegative Garrote
- A new approach to variable selection in least squares problems
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- The Explicit Inverse of a Tridiagonal Matrix
Cited In (showing first 100 items)
- On the choice of high-dimensional regression parameters in Gaussian random tomography
- Spectrally Sparse Nonparametric Regression via Elastic Net Regularized Smoothers
- Multivariate sparse Laplacian shrinkage for joint estimation of two graphical structures
- Variable Selection in Regression-Based Estimation of Dynamic Treatment Regimes
- Coordinate descent algorithm of generalized fused Lasso logistic regression for multivariate trend filtering
- Joint Structural Break Detection and Parameter Estimation in High-Dimensional Nonstationary VAR Models
- Primal path algorithm for compositional data analysis
- Penalized polygram regression
- Tensor denoising with trend filtering
- Fused-MCP With Application to Signal Processing
- Informative gene selection for microarray classification via adaptive elastic net with conditional mutual information
- A novel regularization method for estimation and variable selection in multi-index models
- Sparse group lasso for multiclass functional logistic regression models
- Inference procedures for the variance gamma model and applications
- Sparse regression for large data sets with outliers
- Modified path algorithm of fused Lasso signal approximator for consistent recovery of change points
- Bayesian generalized fused Lasso modeling via NEG distribution
- Qini-based uplift regression
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- Modular proximal optimization for multidimensional total-variation regularization
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- High-dimensional mean estimation via \(\ell_1\) penalized normal likelihood
- Adaptive basis expansion via \(\ell _1\) trend filtering
- SNP selection in genome-wide association studies via penalized support vector machine with MAX test
- A new approach of subgroup identification for high-dimensional longitudinal data
- Alternating direction method of multipliers for nonconvex fused regression problems
- A novel heuristic algorithm to solve penalized regression-based clustering model
- High-dimensional variable selection via low-dimensional adaptive learning
- Model-based feature selection and clustering of RNA-seq data for unsupervised subtype discovery
- Distance metric learning for graph structured data
- Efficient computation for differential network analysis with applications to quadratic discriminant analysis
- Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method
- An elastic-net penalized expectile regression with applications
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Clustering of subsample means based on pairwise L1 regularized empirical likelihood
- A simple and efficient algorithm for fused lasso signal approximator with convex loss function
- Lasso meets horseshoe: a survey
- Sparse group fused Lasso for model segmentation: a hybrid approach
- Incorporating spatial structure into inclusion probabilities for Bayesian variable selection in generalized linear models with the spike-and-slab elastic net
- A Joint Fairness Model with Applications to Risk Predictions for Underrepresented Populations
- Group variable selection via \(\ell_{p,0}\) regularization and application to optimal scoring
- Generalized methods and solvers for noise removal from piecewise constant signals. I: Background theory
- Runtime guarantees for regression problems
- A guide for sparse PCA: model comparison and applications
- Robust subset selection
- Fast feature selection via streamwise procedure for massive data
- PUlasso: High-Dimensional Variable Selection With Presence-Only Data
- Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
- Information criteria bias correction for group selection
- A note on rank reduction in sparse multivariate regression
- Learning Oncogenic Pathways from Binary Genomic Instability Data
- Simultaneous estimation of quantile regression functions using B-splines and total variation penalty
- A coordinate descent method for total variation minimization
- Usage of the GO estimator in high dimensional linear models
- Fusion learning algorithm to combine partially heterogeneous Cox models
- Block decomposition methods for total variation by primal-dual stitching
- On faster convergence of cyclic block coordinate descent-type methods for strongly convex minimization
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- Hyperspectral image fusion with a new hybrid regularization
- A new scope of penalized empirical likelihood with high-dimensional estimating equations
- Convex and non-convex regularization methods for spatial point processes intensity estimation
- Exact post-selection inference for the generalized Lasso path
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Sparse principal component analysis via fractional function regularity
- Solution paths for the generalized Lasso with applications to spatially varying coefficients regression
- Partial penalized empirical likelihood ratio test under sparse case
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Shrinkage estimation and variable selection in multiple regression models with random coefficient autoregressive errors
- Estimation of ARMAX processes with noise corrupted output signal observations
- A new double-regularized regression using Liu and Lasso regularization
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Robust variable selection for finite mixture regression models
- Stable prediction in high-dimensional linear models
- Additive regression splines with total variation and non negative garrote penalties
- Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications
- Outlier detection under a covariate-adjusted exponential regression model with censored data
- Variable selection in partially linear additive hazards model with grouped covariates and a diverging number of parameters
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Penalized and Constrained Optimization: An Application to High-Dimensional Website Advertising
- Hierarchically penalized additive hazards model with diverging number of parameters
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Improving the incoherence of a learned dictionary via rank shrinkage
- Robust network-based analysis of the associations between (epi)genetic measurements
- Generalized additive partial linear models with high-dimensional covariates
- Dictionary learning based on nonnegative matrix factorization using parallel coordinate descent
- Logistic regression with weight grouping priors
- A doubly sparse approach for group variable selection
- Advanced algorithms for penalized quantile and composite quantile regression
- Model selection with distributed SCAD penalty
- Algorithm for overcoming the curse of dimensionality for time-dependent non-convex Hamilton-Jacobi equations arising from optimal control and differential games problems
- Paths Following Algorithm for Penalized Logistic Regression Using SCAD and MCP
- Genetic algorithm versus classical methods in sparse index tracking
- A plug-in approach to sparse and robust principal component analysis
- Variable selection and estimation using a continuous approximation to the \(L_0\) penalty
- Regression with adaptive Lasso and correlation based penalty