A hybrid quasi-Newton method with application in sparse recovery
Publication: 2167383
DOI: 10.1007/s40314-022-01962-8
OpenAlex: W4285390746
MaRDI QID: Q2167383
Zohre Aminifard, Saeide Ghafoori, Saman Babaie-Kafaki
Publication date: 25 August 2022
Published in: Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1007/s40314-022-01962-8
Methods of quasi-Newton type (90C53)
Numerical methods based on nonlinear programming (49M37)
Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Related Items (1)
Cites Work
- Unnamed Item
- Unnamed Item
- Smooth minimization of non-smooth functions
- A particle swarm-BFGS algorithm for nonlinear programming problems
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A combined class of self-scaling and modified quasi-Newton methods
- A new generalized shrinkage conjugate gradient method for sparse recovery
- Convergence analysis of a modified BFGS method on convex minimizations
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Approximation BFGS methods for nonlinear image restoration
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- An optimal parameter choice for the Dai-Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix
- Limited memory BFGS method based on a high-order tensor model
- Broad echo state network for multivariate time series prediction
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Fixed-Point Continuation Applied to Compressed Sensing: Implementation and Numerical Experiments
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Algorithm 851
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- CUTEr and SifDec
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- A survey of quasi-Newton equations and quasi-Newton methods for optimization