A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
DOI: 10.1080/02331934.2018.1482298
zbMath: 1402.65049
OpenAlex: W2809539576
MaRDI QID: Q4559412
Publication date: 3 December 2018
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2018.1482298
Keywords: unconstrained optimization; numerical comparisons; measure function of Byrd and Nocedal; diagonal quasi-Newton update; weak secant
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
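For context (a minimal sketch, not taken from the record itself): the measure function of Byrd and Nocedal named in the title is, in standard notation, \(\psi(B) = \operatorname{tr}(B) - \ln\det(B)\), defined for symmetric positive definite \(B\). Assuming the usual weak secant (quasi-Cauchy) setup with step \(s_k = x_{k+1} - x_k\) and gradient difference \(y_k = g_{k+1} - g_k\), the variational problem the title refers to has the general form

\[
\min_{B \,=\, \operatorname{diag}(b_1,\dots,b_n),\ b_i > 0} \ \operatorname{tr}(B) - \ln\det(B)
\quad \text{subject to} \quad s_k^{\top} B s_k = s_k^{\top} y_k .
\]

The specific closed-form diagonal update and its convergence analysis are given in the paper itself; the display above only fixes notation.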
Related Items
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Diagonal BFGS updates and applications to the limited memory BFGS method
Cites Work
- A new three-term conjugate gradient algorithm for unconstrained optimization
- A modified scaling parameter for the memoryless BFGS updating formula
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- A modified nonmonotone BFGS algorithm for solving smooth nonlinear equations
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A monotone gradient method via weak secant equation for unconstrained optimization
- A new gradient method via quasi-Cauchy relation which guarantees descent
- An extended nonmonotone line search technique for large-scale unconstrained optimization
- Linear and nonlinear programming
- Sizing and Least-Change Secant Methods
- Two-Point Step Size Gradient Methods
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A New Variational Result for Quasi-Newton Formulae
- Self-Scaling Variable Metric (SSVM) Algorithms
- The Quasi-Cauchy Relation and Diagonal Updating
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- A Rapidly Convergent Descent Method for Minimization
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- Convergence Conditions for Ascent Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Convergence Conditions for Ascent Methods. II: Some Corrections
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization
- Benchmarking optimization software with performance profiles