A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
From MaRDI portal
Publication:4559412
Recommendations
- A diagonal quasi-Newton updating method for unconstrained optimization
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Nonmonotone generalized diagonal quasi-Newton algorithm
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
Cites work
- scientific article (title unavailable); zbMATH DE number 3928227
- scientific article (title unavailable); zbMATH DE number 3526471
- scientific article (title unavailable); zbMATH DE number 1206370
- scientific article (title unavailable); zbMATH DE number 6270804
- A Family of Variable-Metric Methods Derived by Variational Means
- A New Variational Result for Quasi-Newton Formulae
- A Rapidly Convergent Descent Method for Minimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified nonmonotone BFGS algorithm for solving smooth nonlinear equations
- A modified scaling parameter for the memoryless BFGS updating formula
- A monotone gradient method via weak secant equation for unconstrained optimization
- A new approach to variable metric algorithms
- A new gradient method via quasi-Cauchy relation which guarantees descent
- A new three-term conjugate gradient algorithm for unconstrained optimization
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- An extended nonmonotone line search technique for large-scale unconstrained optimization
- An unconstrained optimization test functions collection
- Benchmarking optimization software with performance profiles
- Conditioning of Quasi-Newton Methods for Function Minimization
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- Improved Hessian approximation with modified quasi-Cauchy relation for gradient-type method
- Linear and nonlinear programming
- New BFGS method for unconstrained optimization problem based on modified Armijo line search
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Sizing and Least-Change Secant Methods
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The Quasi-Cauchy Relation and Diagonal Updating
- Two-Point Step Size Gradient Methods
Cited in (14)
- Diagonal BFGS updates and applications to the limited memory BFGS method
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- Nonmonotone generalized diagonal quasi-Newton algorithm
- A diagonal quasi-Newton updating method for unconstrained optimization
- A new diagonal quasi-Newton algorithm for unconstrained optimization problems
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- scientific article (title unavailable); zbMATH DE number 1193038
- A new diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Diagonal approximation of the Hessian by finite differences for unconstrained optimization
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Quasi-Newton type of diagonal updating for the L-BFGS method
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- Diagonal quasi-Newton method via variational principle under generalized Frobenius norm