Gradient-based methods for sparse recovery
From MaRDI portal
Publication:3077129
Abstract: The convergence rate is analyzed for the SpaRSA algorithm (Sparse Reconstruction by Separable Approximation) for minimizing a sum \(f(x) + \psi(x)\), where \(f\) is smooth and \(\psi\) is convex, but possibly nonsmooth. It is shown that if \(f\) is convex, then the error in the objective function at iteration \(k\), for \(k\) sufficiently large, is bounded by \(c/k\) for a suitable choice of \(c\). Moreover, if the objective function is strongly convex, then the convergence is \(R\)-linear. An improved version of the algorithm, based on a cyclic version of the Barzilai-Borwein (BB) iteration and an adaptive line search, is given. The performance of the algorithm is investigated using applications in the areas of signal processing and image reconstruction.
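The iteration the abstract describes can be sketched for the common special case \(f(x) = \tfrac12\|Ax-b\|^2\), \(\psi(x) = \lambda\|x\|_1\): each step applies soft thresholding to a gradient step whose length is set by a BB spectral estimate. This is a minimal illustrative sketch, not the paper's algorithm: it omits the cyclic BB rule and the adaptive line search discussed in the abstract, and all function names are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparsa_sketch(A, b, lam, iters=1000, alpha_min=1e-30, alpha_max=1e30):
    """Minimize 0.5*||Ax-b||^2 + lam*||x||_1 by BB-scaled iterative
    shrinkage (a simplified SpaRSA-style sketch: plain BB stepsize,
    no cyclic BB rule and no line search)."""
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)
    alpha = 1.0                      # initial spectral scaling of the Hessian
    for _ in range(iters):
        # Gradient step of length 1/alpha, then prox of (lam/alpha)*||.||_1.
        x_new = soft_threshold(x - grad / alpha, lam / alpha)
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        if s @ s > 0:
            # Barzilai-Borwein estimate alpha = <s,y>/<s,s>, kept in a safe range.
            alpha = np.clip((s @ y) / (s @ s), alpha_min, alpha_max)
        x, grad = x_new, grad_new
    return x
```

For an overdetermined least-squares term, \(f\) is strongly convex, which is the regime where the abstract's \(R\)-linear rate applies; for merely convex \(f\), the sublinear \(c/k\) bound holds instead.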
Recommendations
- Gradient-based algorithms with applications to signal-recovery problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Nonmonotone spectral gradient method for sparse recovery
- An iteratively approximated gradient projection algorithm for sparse signal reconstruction
- A gradient projection method for the sparse signal reconstruction in compressive sensing
Cited in (41)
- Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Sparse recovery based on the generalized error function
- A Barzilai-Borwein type method for minimizing composite functions
- An extended projected residual algorithm for solving smooth convex optimization problems
- scientific article; zbMATH DE number 5853153
- Two-point step-size iterative soft-thresholding method for sparse reconstruction
- A Barzilai-Borwein-like iterative half thresholding algorithm for the \(L_{1/2}\) regularized problem
- Gradient-based algorithms with applications to signal-recovery problems
- An active set Newton-CG method for \(\ell_1\) optimization
- Active set complexity of the away-step Frank-Wolfe algorithm
- Delayed gradient methods for symmetric and positive definite linear systems
- A relaxed-PPA contraction method for sparse signal recovery
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- An \(\mathcal O(1/{k})\) convergence rate for the variable stepsize Bregman operator splitting algorithm
- A note on the spectral gradient projection method for nonlinear monotone equations with applications
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Convergence of a Class of Nonmonotone Descent Methods for Kurdyka–Łojasiewicz Optimization Problems
- On globally Q-linear convergence of a splitting method for group Lasso
- A hybrid finite-dimensional RHC for stabilization of time-varying parabolic equations
- Sparse signal recovery based on forward backward operator splitting
- Splitting augmented Lagrangian-type algorithms with partial quadratic approximation to solve sparse signal recovery problems
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- Recovering gradients from sparsely observed functional data
- Projection onto a polyhedron that exploits sparsity
- Bregman operator splitting with variable stepsize for total variation image reconstruction
- An efficient augmented Lagrangian method with applications to total variation minimization
- A new generalized shrinkage conjugate gradient method for sparse recovery
- A truncated Newton algorithm for nonconvex sparse recovery
- Nonmonotone spectral gradient method for sparse recovery
- A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
- Efficient Least Residual Greedy Algorithms for Sparse Recovery
- An active set algorithm for nonlinear optimization with polyhedral constraints
- A model of regularization parameter determination in low-dose X-ray CT reconstruction based on dictionary learning
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- A fast homotopy algorithm for gridless sparse recovery
- Convergence of slice-based block coordinate descent algorithm for convolutional sparse coding
- On the rate of convergence of projected Barzilai-Borwein methods
- A new spectral method for \(l_1\)-regularized minimization
- The Moreau envelope based efficient first-order methods for sparse recovery
- A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring