A scalar Hessian estimation with a sparse nonmonotone line search technique for the sparse recovery problem
Publication: 6574066
DOI: 10.3934/JIMO.2024030
MaRDI QID: Q6574066
Authors: Zohre Aminifard, Saman Babaie-Kafaki
Publication date: 18 July 2024
Published in: Journal of Industrial and Management Optimization
Recommendations
- scientific article; zbMATH DE number 7156643
- Nonmonotone spectral gradient method for sparse recovery
- The application of a supermemory gradient method for large-scale signal reconstruction problems
- Gradient-based methods for sparse recovery
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
Classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Cites Work
- Algorithm 851
- CUTEr and SifDec
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- Benchmarking optimization software with performance profiles
- First-order methods in optimization
- Two-Point Step Size Gradient Methods
- A coordinate gradient descent method for nonsmooth separable minimization
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- A new generalized shrinkage conjugate gradient method for sparse recovery
- Fixed-Point Continuation Applied to Compressed Sensing: Implementation and Numerical Experiments
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- A Positive Barzilai–Borwein-Like Stepsize and an Extension for Symmetric Linear Systems
- A new nonmonotone line search technique for unconstrained optimization
- Convex functions
- Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing
- A Barzilai-Borwein type method for minimizing composite functions
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- Adaptive scaling damped BFGS method without gradient Lipschitz continuity
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Adaptive three-term PRP algorithms without gradient Lipschitz continuity condition for nonconvex functions
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions