Gradient projection Newton pursuit for sparsity constrained optimization
From MaRDI portal
Publication:2168680
Abstract: Hard-thresholding-based algorithms offer several advantages for sparse optimization: they control the sparsity level directly and allow for fast computation. Recent research shows that when techniques from Newton-type methods are integrated, their numerical performance can improve markedly. This paper develops a gradient projection Newton pursuit algorithm that mainly adopts the hard-thresholding operator and invokes the Newton pursuit only when certain conditions are satisfied. The proposed algorithm converges globally and quadratically under standard assumptions. For compressive sensing problems, the imposed assumptions are much weaker than those required by many state-of-the-art algorithms. Moreover, extensive numerical experiments demonstrate its strong performance in comparison with other leading solvers.
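The scheme described in the abstract, alternating a hard-thresholding gradient projection step with an occasional Newton (support-restricted) refinement, can be sketched as follows for the compressive sensing model \(\min_x \frac{1}{2}\|Ax-b\|_2^2\) subject to \(\|x\|_0 \le s\). This is an illustrative simplification, not the paper's exact method: the switching condition here (accept the Newton step only if it lowers the residual) is a stand-in for the paper's precise criterion, and the function names are our own.

```python
import numpy as np

def hard_threshold(x, s):
    """Projection onto the sparsity constraint: keep the s largest-magnitude entries."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    z[idx] = x[idx]
    return z

def gp_newton_pursuit(A, b, s, step=None, tol=1e-8, max_iter=500):
    """Sketch of a gradient projection method with a Newton-style pursuit step.

    Minimizes 0.5*||Ax - b||^2 subject to ||x||_0 <= s.
    """
    m, n = A.shape
    if step is None:
        # 1/L step size, where L = sigma_max(A)^2 is the gradient's Lipschitz constant
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        u = hard_threshold(x - step * grad, s)   # gradient projection step
        T = np.flatnonzero(u)                    # current active support
        # Newton pursuit step: for least squares, the Newton direction on the
        # fixed support T amounts to solving the restricted least-squares problem.
        v = np.zeros(n)
        v[T] = np.linalg.lstsq(A[:, T], b, rcond=None)[0]
        # Accept the pursuit step only when it improves the residual
        # (a proxy for the paper's switching condition).
        x_new = v if np.linalg.norm(A @ v - b) <= np.linalg.norm(A @ u - b) else u
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x
```

In this sketch the pursuit step is what makes the method fast in practice: once the correct support is identified, the restricted least-squares solve recovers the exact sparse signal in one step, which is consistent with the quadratic local convergence claimed in the abstract.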
Recommendations
- Iterative projection gradient hard thresholding pursuit algorithm for sparse optimization
- Optimal \(k\)-thresholding algorithms for sparse optimization problems
- Partial gradient optimal thresholding algorithms for a class of sparse optimization problems
- A new conjugate gradient hard thresholding pursuit algorithm for sparse signal recovery
- Sparsity constrained nonlinear optimization: optimality conditions and algorithms
Cites Work
- scientific article; zbMATH DE number 6982922
- scientific article; zbMATH DE number 7370529
- A null-space-based weighted \(\ell_1\) minimization approach to compressed sensing
- A tight bound of hard thresholding
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Alternating direction algorithms for \(\ell_1\)-problems in compressive sensing
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Compressed Sensing With Nonlinear Observations and Related Nonlinear Optimization Problems
- Compressed sensing
- Compressive sensing and structured random matrices
- Computing a Trust Region Step
- Decoding by Linear Programming
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- GESPAR: Efficient Phase Retrieval of Sparse Signals
- Gradient Pursuits
- Greedy sparsity-constrained optimization
- Hard thresholding pursuit: an algorithm for compressive sensing
- Improved iteratively reweighted least squares for unconstrained smoothed \(\ell_q\) minimization
- Iterative hard thresholding for compressed sensing
- Iterative thresholding for sparse approximations
- Minimization of \(\ell_{1-2}\) for compressed sensing
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- On solutions of sparsity constrained optimization
- Probing the Pareto frontier for basis pursuit solutions
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Sparse Optimization with Least-Squares Constraints
- Sparsity constrained nonlinear optimization: optimality conditions and algorithms
- Stable signal recovery from incomplete and inaccurate measurements
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Trading accuracy for sparsity in optimization problems with sparsity constraints
Cited In (8)
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
- Scaled proximal gradient methods for sparse optimization problems
- Title not available
- Algorithms for sparsity-constrained optimization
- Iterative projection gradient hard thresholding pursuit algorithm for sparse optimization
- Partial gradient optimal thresholding algorithms for a class of sparse optimization problems
- A greedy Newton-type method for multiple sparse constraint problem
- Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery