A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
DOI: 10.1080/02331934.2018.1482298
zbMATH Open: 1402.65049
OpenAlex: W2809539576
Wikidata: Q129724065 (Scholia: Q129724065)
MaRDI QID: Q4559412
FDO: Q4559412
Authors: Neculai Andrei
Publication date: 3 December 2018
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2018.1482298
Recommendations
- A diagonal quasi-Newton updating method for unconstrained optimization
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Nonmonotone generalized diagonal quasi-Newton algorithm
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
Keywords: numerical comparisons; unconstrained optimization; measure function of Byrd and Nocedal; diagonal quasi-Newton update; weak secant
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Methods of quasi-Newton type (90C53)
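For orientation, the two concepts named in the keywords admit compact statements; the following is a minimal sketch in standard quasi-Newton notation (as used in the cited works of Byrd and Nocedal and in the quasi-Cauchy literature), not an excerpt from the paper itself. The measure function of Byrd and Nocedal for a symmetric positive definite matrix \(A\) is
\[
\psi(A) = \operatorname{tr}(A) - \ln\det(A),
\]
and the weak secant (quasi-Cauchy) condition on the updated Hessian approximation \(B_{k+1}\), with \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\), requires only
\[
s_k^{\top} B_{k+1}\, s_k = s_k^{\top} y_k
\]
in place of the full secant equation \(B_{k+1} s_k = y_k\); a diagonal quasi-Newton update selects a diagonal \(B_{k+1}\) subject to this scalar condition.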
Cites Work
- Title not available
- A Rapidly Convergent Descent Method for Minimization
- Benchmarking optimization software with performance profiles
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- Two-Point Step Size Gradient Methods
- A new three-term conjugate gradient algorithm for unconstrained optimization
- An unconstrained optimization test functions collection
- Title not available
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Linear and nonlinear programming
- Title not available
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Sizing and Least-Change Secant Methods
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified scaling parameter for the memoryless BFGS updating formula
- Self-Scaling Variable Metric (SSVM) Algorithms
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- The Quasi-Cauchy Relation and Diagonal Updating
- A New Variational Result for Quasi-Newton Formulae
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- A modified nonmonotone BFGS algorithm for solving smooth nonlinear equations
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- A monotone gradient method via weak secant equation for unconstrained optimization
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- An extended nonmonotone line search technique for large-scale unconstrained optimization
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- Improved Hessian approximation with modified quasi-Cauchy relation for gradient-type method
- Title not available
Cited In (14)
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Title not available
- Quasi-Newton type of diagonal updating for the L-BFGS method
- Diagonal BFGS updates and applications to the limited memory BFGS method
- Nonmonotone generalized diagonal quasi-Newton algorithm
- A new diagonal quasi-Newton algorithm for unconstrained optimization problems
- Diagonal approximation of the Hessian by finite differences for unconstrained optimization
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- A diagonal quasi-Newton updating method for unconstrained optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Diagonal quasi-Newton method via variational principle under generalized Frobenius norm