Gradient-based methods for sparse recovery
DOI: 10.1137/090775063 · zbMATH Open: 1209.90266 · arXiv: 0912.1660 · OpenAlex: W1999627297 · MaRDI QID: Q3077129 · FDO: Q3077129
Authors: William Hager, Dzung T. Phan, Hongchao Zhang
Publication date: 22 February 2011
Published in: SIAM Journal on Imaging Sciences
Full work available at URL: https://arxiv.org/abs/0912.1660
Recommendations
- Gradient-based algorithms with applications to signal-recovery problems
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Nonmonotone spectral gradient method for sparse recovery
- An iteratively approximated gradient projection algorithm for sparse signal reconstruction
- A gradient projection method for the sparse signal reconstruction in compressive sensing
Keywords: denoising; sparse recovery; compressed sensing; image reconstruction; iterative shrinkage thresholding algorithm; nonsmooth optimization; sublinear convergence; linear convergence; BB method; nonmonotone convergence; sparse reconstruction by separable approximation
MSC: Convex programming (90C25); Complexity and performance of numerical algorithms (65Y20); Large-scale problems in mathematical programming (90C06); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Cited In (41)
- Nonmonotone spectral gradient method for sparse recovery
- An efficient augmented Lagrangian method with applications to total variation minimization
- A new generalized shrinkage conjugate gradient method for sparse recovery
- Efficient Least Residual Greedy Algorithms for Sparse Recovery
- Splitting augmented Lagrangian-type algorithms with partial quadratic approximation to solve sparse signal recovery problems
- A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
- An active set algorithm for nonlinear optimization with polyhedral constraints
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- On the rate of convergence of projected Barzilai-Borwein methods
- Projection onto a polyhedron that exploits sparsity
- Sparse recovery based on the generalized error function
- Convergence of a Class of Nonmonotone Descent Methods for Kurdyka–Łojasiewicz Optimization Problems
- A truncated Newton algorithm for nonconvex sparse recovery
- A new spectral method for \(l_1\)-regularized minimization
- Sparse signal recovery based on forward backward operator splitting
- The Moreau envelope based efficient first-order methods for sparse recovery
- A Barzilai-Borwein-like iterative half thresholding algorithm for the \(L_{1/2}\) regularized problem
- Recovering gradients from sparsely observed functional data
- A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
- A model of regularization parameter determination in low-dose X-ray CT reconstruction based on dictionary learning
- An extended projected residual algorithm for solving smooth convex optimization problems
- On globally Q-linear convergence of a splitting method for group Lasso
- Gradient-based algorithms with applications to signal-recovery problems
- An active set Newton-CG method for \(\ell_1\) optimization
- A note on the spectral gradient projection method for nonlinear monotone equations with applications
- Delayed gradient methods for symmetric and positive definite linear systems
- A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring
- A fast homotopy algorithm for gridless sparse recovery
- A Barzilai-Borwein type method for minimizing composite functions
- A hybrid finite-dimensional RHC for stabilization of time-varying parabolic equations
- A relaxed-PPA contraction method for sparse signal recovery
- A preconditioned conjugate gradient method with active set strategy for \(\ell_1\)-regularized least squares
- An \(\mathcal O(1/{k})\) convergence rate for the variable stepsize Bregman operator splitting algorithm
- Bregman operator splitting with variable stepsize for total variation image reconstruction
- Active set complexity of the away-step Frank-Wolfe algorithm
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- Convergence of slice-based block coordinate descent algorithm for convolutional sparse coding
- A fast conjugate gradient algorithm with active set prediction for ℓ1 optimization
- Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery
- Two-point step-size iterative soft-thresholding method for sparse reconstruction