Gradient-based methods for sparse recovery
From MaRDI portal
Publication:3077129
Abstract: The convergence rate is analyzed for the SpaRSA algorithm (Sparse Reconstruction by Separable Approximation) for minimizing a sum \(f(x)+\psi(x)\), where \(f\) is smooth and \(\psi\) is convex, but possibly nonsmooth. It is shown that if \(f\) is convex, then the error in the objective function at iteration \(k\), for \(k\) sufficiently large, is bounded by \(a/(b+k)\) for suitable choices of \(a\) and \(b\). Moreover, if the objective function is strongly convex, then the convergence is \(R\)-linear. An improved version of the algorithm, based on a cyclic version of the Barzilai-Borwein (BB) iteration and an adaptive line search, is given. The performance of the algorithm is investigated using applications in the areas of signal processing and image reconstruction.
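The iteration described in the abstract can be sketched as follows: each step applies soft-thresholding (the proximal operator of the \(\ell_1\) term) to a gradient step on the smooth part, with the step size chosen by a Barzilai-Borwein rule. This is a minimal monotone sketch for the special case \(f(x)=\frac12\|Ax-b\|^2\), \(\psi(x)=\lambda\|x\|_1\); the function names and the simple fallback step size are illustrative assumptions, and the paper's actual algorithm additionally uses a cyclic BB rule and an adaptive (nonmonotone) line search.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparsa_sketch(A, b, lam, max_iter=500, tol=1e-8):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by iterative
    soft-thresholding with a Barzilai-Borwein (BB) step size.
    A simplified sketch, not the paper's full SpaRSA variant."""
    n = A.shape[1]
    x = np.zeros(n)
    g = A.T @ (A @ x - b)          # gradient of the smooth part
    alpha = 1.0                    # initial BB scalar (curvature estimate)
    for _ in range(max_iter):
        # gradient step on f, then prox of (lam/alpha)*||.||_1
        x_new = soft_threshold(x - g / alpha, lam / alpha)
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        if np.linalg.norm(s) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        # BB rule: alpha approximates curvature via s'y / s's
        sy = s @ y
        alpha = sy / (s @ s) if sy > 0 else 1.0  # fallback keeps alpha positive
        x, g = x_new, g_new
    return x
```

For example, with `A` the identity the smooth part decouples and the method reduces to a single soft-thresholding of `b`, so `sparsa_sketch(np.eye(3), np.array([2.0, 0.1, -1.5]), 0.5)` returns `[1.5, 0.0, -1.0]`.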
Recommendations
- Gradient-based algorithms with applications to signal-recovery problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Nonmonotone spectral gradient method for sparse recovery
- An iteratively approximated gradient projection algorithm for sparse signal reconstruction
- A gradient projection method for the sparse signal reconstruction in compressive sensing
Cited in (41)
- Nonmonotone spectral gradient method for sparse recovery
- scientific article; zbMATH DE number 5853153
- A new generalized shrinkage conjugate gradient method for sparse recovery
- An efficient augmented Lagrangian method with applications to total variation minimization
- Efficient Least Residual Greedy Algorithms for Sparse Recovery
- An active set algorithm for nonlinear optimization with polyhedral constraints
- Splitting augmented Lagrangian-type algorithms with partial quadratic approximation to solve sparse signal recovery problems
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- On the rate of convergence of projected Barzilai-Borwein methods
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Projection onto a polyhedron that exploits sparsity
- A truncated Newton algorithm for nonconvex sparse recovery
- Sparse recovery based on the generalized error function
- A new spectral method for \(l_1\)-regularized minimization
- Convergence of a Class of Nonmonotone Descent Methods for Kurdyka–Łojasiewicz Optimization Problems
- A Barzilai-Borwein-like iterative half thresholding algorithm for the \(L_{1/2}\) regularized problem
- Sparse signal recovery based on forward backward operator splitting
- The Moreau envelope based efficient first-order methods for sparse recovery
- A model of regularization parameter determination in low-dose X-ray CT reconstruction based on dictionary learning
- Recovering gradients from sparsely observed functional data
- An extended projected residual algorithm for solving smooth convex optimization problems
- A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
- On globally Q-linear convergence of a splitting method for group Lasso
- Gradient-based algorithms with applications to signal-recovery problems
- A note on the spectral gradient projection method for nonlinear monotone equations with applications
- An active set Newton-CG method for \(\ell_1\) optimization
- Delayed gradient methods for symmetric and positive definite linear systems
- A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring
- A fast homotopy algorithm for gridless sparse recovery
- A Barzilai-Borwein type method for minimizing composite functions
- A hybrid finite-dimensional RHC for stabilization of time-varying parabolic equations
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- A relaxed-PPA contraction method for sparse signal recovery
- An \(\mathcal O(1/{k})\) convergence rate for the variable stepsize Bregman operator splitting algorithm
- Bregman operator splitting with variable stepsize for total variation image reconstruction
- Convergence of slice-based block coordinate descent algorithm for convolutional sparse coding
- Active set complexity of the away-step Frank-Wolfe algorithm
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery
- Two-point step-size iterative soft-thresholding method for sparse reconstruction
This page was built for publication: Gradient-based methods for sparse recovery