An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
From MaRDI portal
Publication:2034433
DOI: 10.1016/J.APNUM.2021.05.002
zbMATH Open: 1467.65060
OpenAlex: W3163342783
MaRDI QID: Q2034433
FDO: Q2034433
Authors: Zohre Aminifard, Saman Babaie-Kafaki, S. Ghafoori
Publication date: 22 June 2021
Published in: Applied Numerical Mathematics
Full work available at URL: https://doi.org/10.1016/j.apnum.2021.05.002
Recommendations
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- A restart scheme for the memoryless BFGS method
- A modified BFGS algorithm based on a hybrid secant equation
Cites Work
- CUTEr and SifDec
- Title not available
- Benchmarking optimization software with performance profiles
- Smooth minimization of non-smooth functions
- Optimization theory and methods. Nonlinear programming
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- A new generalized shrinkage conjugate gradient method for sparse recovery
- New quasi-Newton methods for unconstrained optimization problems
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A modified BFGS method and its global convergence in nonconvex minimization
- New quasi-Newton equation and related methods for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- A Modified BFGS Algorithm for Unconstrained Optimization
- Optimal conditioning of self-scaling variable Metric algorithms
- Convergence analysis of a modified BFGS method on convex minimizations
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
Cited In (7)
- A hybrid quasi-Newton method with application in sparse recovery
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
- Title not available
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- On the generalization of secant method and the order of convergence
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing