Gradient-based method with active set strategy for $\ell _1$ optimization
From MaRDI portal
DOI: 10.1090/mcom/3238 · zbMath: 1392.90079 · OpenAlex: W2606445905 · MaRDI QID: Q4605700
Publication date: 27 February 2018
Published in: Mathematics of Computation
Full work available at URL: https://doi.org/10.1090/mcom/3238
MSC classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
- Complexity and performance of numerical algorithms (65Y20)
Related Items
- Sparse solutions to an underdetermined system of linear equations via penalized Huber loss
- A truncated Newton algorithm for nonconvex sparse recovery
- An active set Newton-CG method for \(\ell_1\) optimization
- An inexact quasi-Newton algorithm for large-scale \(\ell_1\) optimization with box constraints
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- Improved SVRG for finite sum structure optimization with application to binary classification
- An active set Barzilai-Borwein algorithm for \(l_0\) regularized optimization
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A second-order method for strongly convex \(\ell _1\)-regularization problems
- A new analysis on the Barzilai-Borwein gradient method
- A coordinate gradient descent method for nonsmooth separable minimization
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Linear convergence of epsilon-subgradient descent methods for a class of convex functions
- Nonmonotone spectral method for large-scale symmetric nonlinear equations
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- R-linear convergence of the Barzilai and Borwein gradient method
- A second-order method for convex \(\ell_1\)-regularized optimization with active-set prediction
- Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms
- On the convergence of an active-set method for \(\ell_1\) minimization
- A First-Order Augmented Lagrangian Method for Compressed Sensing
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- A First-Order Smoothed Penalty Method for Compressed Sensing
- Gradient-Based Methods for Sparse Recovery
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- An EM algorithm for wavelet-based image restoration
- Stable recovery of sparse overcomplete representations in the presence of noise
- Greed is Good: Algorithmic Results for Sparse Approximation
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Two-Point Step Size Gradient Methods
- On the Identification of Active Constraints
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Sparse Reconstruction by Separable Approximation
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- A Nonmonotone Line Search Technique for Newton’s Method
- Fast Image Recovery Using Variable Splitting and Constrained Optimization
- Local Linear Convergence of the Alternating Direction Method of Multipliers on Quadratic or Linear Programs
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing
- The cyclic Barzilai–Borwein method for unconstrained optimization
- Compressed sensing
- Benchmarking optimization software with performance profiles
- Adaptive greedy approximations