scientific article; zbMATH DE number 7814804
From MaRDI portal
Publication: 6191840
DOI: 10.22124/jmm.2023.23425.2089
MaRDI QID: Q6191840
Farzad Rahpeymaii, Saeed Nezhadhosein, Unnamed Author
Publication date: 11 March 2024
Title: unavailable (zbMATH Open licensing restrictions)
Cites Work
- A modified scaling parameter for the memoryless BFGS updating formula
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A descent spectral conjugate gradient method for impulse noise removal
- New quasi-Newton equation and related methods for unconstrained optimization
- Modifying the BFGS method
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A new family of conjugate gradient methods for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing
- A hybrid quasi-Newton method with application in sparse recovery
- Diagonal approximation of the Hessian by finite differences for unconstrained optimization
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- A descent extension of a modified Polak-Ribière-Polyak method with application in image restoration problem
- Sizing and Least-Change Secant Methods
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Some Superlinear Convergence Results for the Conjugate Gradient Method
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- Quasi-Newton Methods, Motivation and Theory
- Numerical Optimization
- An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A derivative‐free scaling memoryless Broyden–Fletcher–Goldfarb–Shanno method for solving a system of monotone nonlinear equations
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A Class of Methods for Solving Nonlinear Simultaneous Equations
- CUTEr and SifDec
- The Convergence of a Class of Double-rank Minimization Algorithms
- New nonlinear conjugate gradient methods based on optimal Dai-Liao parameters
- A modified BFGS method and its global convergence in nonconvex minimization
- Benchmarking optimization software with performance profiles
- Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization