Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
From MaRDI portal
Publication: 2083385
DOI: 10.3934/jimo.2021191 · OpenAlex: W3215142244 · MaRDI QID: Q2083385
Zohre Aminifard, Saman Babaie-Kafaki
Publication date: 10 October 2022
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2021191
Methods of quasi-Newton type (90C53)
Numerical methods based on nonlinear programming (49M37)
Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Related Items (3)
Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing ⋮ A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem ⋮ A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
Uses Software
Cites Work
- A modified scaling parameter for the memoryless BFGS updating formula
- A particle swarm-BFGS algorithm for nonlinear programming problems
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A combined class of self-scaling and modified quasi-Newton methods
- A modified scaled conjugate gradient method with global convergence for nonconvex functions
- A new generalized shrinkage conjugate gradient method for sparse recovery
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A new structured quasi-Newton algorithm using partial information on Hessian
- Approximation BFGS methods for nonlinear image restoration
- Convergence theory for the structured BFGS secant method with an application to nonlinear least squares
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- New quasi-Newton equation and related methods for unconstrained optimization
- Multi-step quasi-Newton methods for optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- Limited memory BFGS algorithm for the matrix approximation problem in Frobenius norm
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A descent cautious BFGS method for computing US-eigenvalues of symmetric complex tensors
- A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems
- Limited memory BFGS method based on a high-order tensor model
- Broad echo state network for multivariate time series prediction
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- A modified Hestenes–Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method
- A Nonmonotone Line Search Technique for Newton’s Method
- A robust multi-batch L-BFGS method for machine learning
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- CUTEr and SifDec
- New conjugacy conditions and related nonlinear conjugate gradient methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- A survey of quasi-Newton equations and quasi-Newton methods for optimization