Converting ADMM to a proximal gradient for efficient sparse estimation
From MaRDI portal
Publication:2103289
Abstract: In sparse estimation problems, such as the fused lasso and convex clustering, either the proximal gradient method or the alternating direction method of multipliers (ADMM) is applied. The latter is slowed by a matrix inversion at each step, while efficient accelerations such as FISTA (the fast iterative shrinkage-thresholding algorithm) have been developed for the former. This paper proposes a general method for converting an ADMM solution into a proximal gradient method, assuming that the derivative of the objective function is Lipschitz continuous. We then apply it to sparse estimation problems, such as sparse convex clustering and trend filtering, and numerical experiments show a significant improvement in efficiency.
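The proximal gradient method and its FISTA acceleration mentioned in the abstract can be illustrated on the ordinary lasso (a minimal sketch; the function names and the lasso test problem are illustrative and not taken from the paper itself):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(X, y, lam, n_iter=200):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by FISTA.

    The gradient of the smooth part, X^T(Xb - y), is Lipschitz
    continuous with constant L = ||X||_2^2, matching the paper's
    standing assumption on the objective's derivative.
    """
    p = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the gradient
    b = np.zeros(p)                    # current iterate
    z = b.copy()                       # extrapolated (momentum) point
    t = 1.0                            # FISTA momentum parameter
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y)
        b_new = soft_threshold(z - grad / L, lam / L)  # proximal gradient step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = b_new + ((t - 1) / t_new) * (b_new - b)    # momentum update
        b, t = b_new, t_new
    return b
```

No matrix inversion appears anywhere: each iteration costs only matrix-vector products and a closed-form proximal step, which is the efficiency advantage the paper seeks to recover when converting an ADMM formulation into this form.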
Recommendations
- Fast algorithms for sparse inverse covariance estimation
- Linear convergence of the alternating direction method of multipliers for a class of convex optimization problems
- Understanding the convergence of the alternating direction method of multipliers: theoretical and computational perspectives
- On the linear convergence of the alternating direction method of multipliers
- Linearized alternating direction method of multipliers for sparse group and fused Lasso models
Cites work
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 6438182
- $\ell_1$ Trend Filtering
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A new look at the statistical model identification
- A three-operator splitting scheme and its optimization applications
- Alternating proximal gradient method for convex minimization
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Convex Analysis
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Estimating the dimension of a model
- Model Selection and Estimation in Regression with Grouped Variables
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Proximité et dualité dans un espace hilbertien
- Sparse Convex Clustering
- Sparse estimation with math and R. 100 exercises for building logic
- Sparsity and Smoothness Via the Fused Lasso
Cited in (3)