A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
From MaRDI portal
Abstract: In this paper, we consider the problem of recovering a sparse signal based on penalized least squares formulations. We develop a novel algorithm of primal-dual active set type for a class of nonconvex sparsity-promoting penalties, including \(\ell^0\), bridge, smoothly clipped absolute deviation, capped \(\ell^1\), and minimax concave penalty. First we establish the existence of a global minimizer for the related optimization problems. Then we derive a novel necessary optimality condition for the global minimizer using the associated thresholding operator. The solutions to the optimality system are coordinate-wise minimizers, and under minor conditions, they are also local minimizers. Upon introducing the dual variable, the active set can be determined using the primal and dual variables together. Further, this relation lends itself to an iterative algorithm of active set type which at each step involves first updating the primal variable only on the active set and then updating the dual variable explicitly. When combined with a continuation strategy on the regularization parameter, the primal dual active set method is shown to converge globally to the underlying regression target under certain regularity conditions. Extensive numerical experiments with both simulated and real data demonstrate its superior performance in efficiency and accuracy compared with the existing sparse recovery methods.
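For the \(\ell^0\) penalty, the iteration described in the abstract can be sketched as follows: with threshold \(T = \sqrt{2\lambda}\), the active set is read off from the sum of the primal variable \(x\) and the dual variable \(d = A^T(y - Ax)\); the primal variable is then updated by a least squares solve restricted to the active set, and the dual variable explicitly. This is a minimal NumPy sketch under those assumptions; the continuation schedule (`lam0`, `rho`) and iteration caps are illustrative choices, not taken from the paper.

```python
import numpy as np

def pdas_l0(A, y, lam, x0=None, max_iter=50):
    """Primal-dual active set loop (sketch) for
    min_x 0.5*||Ax - y||^2 + lam*||x||_0."""
    n = A.shape[1]
    x = np.zeros(n) if x0 is None else x0.copy()
    d = A.T @ (y - A @ x)            # dual variable
    T = np.sqrt(2.0 * lam)           # hard-threshold level for the l0 penalty
    active = np.abs(x + d) > T       # active set from primal + dual
    for _ in range(max_iter):
        x = np.zeros(n)
        if active.any():
            # primal update: least squares restricted to the active set
            x[active] = np.linalg.lstsq(A[:, active], y, rcond=None)[0]
        d = A.T @ (y - A @ x)        # explicit dual update
        new_active = np.abs(x + d) > T
        if np.array_equal(new_active, active):
            break                    # active set has stabilized
        active = new_active
    return x

def pdas_continuation(A, y, lam_target, lam0=None, rho=0.7):
    """Continuation: solve along a geometrically decreasing sequence of
    regularization parameters, warm-starting from the previous solution."""
    lam = lam0 if lam0 is not None else 0.5 * np.max(np.abs(A.T @ y)) ** 2
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        x = pdas_l0(A, y, lam, x0=x)
        lam *= rho
    return pdas_l0(A, y, lam_target, x0=x)
```

For orthogonal designs (e.g. `A = np.eye(n)`) the scheme reduces to hard thresholding of \(y\) at level \(\sqrt{2\lambda}\), which gives a quick sanity check of an implementation.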
Recommendations
- An active set Barzilai-Borwein algorithm for \(l_0\) regularized optimization
- Hard thresholding pursuit with continuation for \(\ell^{0}\)-regularized minimizations
- A non-convex piecewise quadratic approximation of \(\ell_0\) regularization: theory and accelerated algorithm
- Newton method for \(\ell_0\)-regularized optimization
- Proximal algorithm for minimization problems in \(l_0\)-regularization for nonlinear inverse problems
Cites work
- scientific article; zbMATH DE number 845714
- A Remark on the Restricted Isometry Property in Orthogonal Matching Pursuit
- A proximal-gradient homotopy method for the sparse least-squares problem
- A regularization parameter for nonsmooth Tikhonov regularization
- A variational approach to sparsity optimization based on Lagrange multiplier theory
- Atomic Decomposition by Basis Pursuit
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Compressed sensing
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Decoding by Linear Programming
- Description of the minimizers of least squares regularized with \(\ell_0\)-norm. Uniqueness of the global minimizer
- Fixed-Point Continuation for \(\ell_1\)-Minimization: Methodology and Convergence
- From simulated annealing to stochastic continuation: a new trend in combinatorial optimization
- Gradient Pursuits
- Greed is Good: Algorithmic Results for Sparse Approximation
- Hard thresholding pursuit: an algorithm for compressive sensing
- Iterative hard thresholding for compressed sensing
- Iterative thresholding for sparse approximations
- Just relax: convex programming methods for identifying sparse signals in noise
- Lagrange Multiplier Approach to Variational Problems and Applications
- Optimization by stochastic continuation
- Optimization with sparsity-inducing penalties
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Proximal splitting methods in signal processing
- Recovery of sparse signals using OMP and its variants: convergence analysis based on RIP
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Sparse Approximation via Penalty Decomposition Methods
- Sparse Reconstruction by Separable Approximation
- Sparse Recovery With Orthogonal Matching Pursuit Under RIP
- Sparse Solution of Underdetermined Systems of Linear Equations by Stagewise Orthogonal Matching Pursuit
- Stagewise Weak Gradient Pursuits
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing
- Uncertainty principles and ideal atomic decomposition
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
Cited in (41)
- Newton method for \(\ell_0\)-regularized optimization
- An alternating direction method with continuation for nonconvex low rank minimization
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- Iteratively weighted thresholding homotopy method for the sparse solution of underdetermined linear equations
- On monotone and primal-dual active set schemes for \(\ell^p\)-type problems, \(p \in (0,1]\)
- A ``nonconvex + nonconvex'' approach for image restoration with impulse noise removal
- Variable selection via generalized SELO-penalized linear regression models
- Tikhonov regularisation method for simultaneous inversion of the source term and initial data in a time-fractional diffusion equation
- Variable selection via generalized SELO-penalized Cox regression models
- A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression
- Joint sparse optimization: lower-order regularization method and application in cell fate conversion
- A non-convex piecewise quadratic approximation of \(\ell_0\) regularization: theory and accelerated algorithm
- A dual active set method for \(\ell_1\)-regularized problem
- Numerical solution of time-dependent component with sparse structure of source term for a time fractional diffusion equation
- A two-metric variable scaled forward-backward algorithm for \(\ell_0\) optimization problem and its applications
- Hard thresholding pursuit with continuation for \(\ell^{0}\)-regularized minimizations
- An alternating direction method of multipliers for MCP-penalized regression with high-dimensional data
- An FE-inexact heterogeneous ADMM for elliptic optimal control problems with \(L^1\)-control cost
- L0-Regularized Learning for High-Dimensional Additive Hazards Regression
- Solution sets of three sparse optimization problems for multivariate regression
- A unified primal dual active set algorithm for nonconvex sparse recovery
- Solving Elliptic Problems with Singular Sources Using Singularity Splitting Deep Ritz Method
- Imaging anisotropic conductivities from current densities
- A communication-efficient method for ℓ0 regularization linear regression models
- A data-driven line search rule for support recovery in high-dimensional data analysis
- Proximal algorithm for minimization problems in \(l_0\)-regularization for nonlinear inverse problems
- On a monotone scheme for nonconvex nonsmooth optimization with applications to fracture mechanics
- L0-regularization for high-dimensional regression with corrupted data
- The springback penalty for robust signal recovery
- Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
- Smoothing Newton method for \(\ell^0\)-\(\ell^2\) regularized linear inverse problem
- An ADMM with continuation algorithm for non-convex SICA-penalized regression in high dimensions
- Sparse signal reconstruction via the approximations of \(\ell_0\) quasinorm
- Theory and fast learned solver for \(\ell^1\)-TV regularization
- Weighted thresholding homotopy method for sparsity constrained optimization
- Convergence of iterative hard-thresholding algorithm with continuation
- scientific article; zbMATH DE number 6982301
- An active set Barzilai-Borwein algorithm for \(l_0\) regularized optimization
- Truncated \(L_1\) regularized linear regression: theory and algorithm
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
This page was built for publication: A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
MaRDI item Q890462