A double parameter scaled BFGS method for unconstrained optimization
From MaRDI portal
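The publication concerns a scaled BFGS method, in which the quasi-Newton approximation is rescaled at each iteration before the BFGS update. The paper's actual double-parameter scaling is not reproduced here; the sketch below is a generic single-parameter self-scaling BFGS iteration (Oren-Luenberger-style scaling, Armijo backtracking line search), with the function names `scaled_bfgs_minimize` and the blending parameter `theta` chosen purely for illustration.

```python
import numpy as np

def scaled_bfgs_minimize(f, grad, x0, theta=1.0, max_iter=200, tol=1e-8):
    """Minimize f with a BFGS iteration whose inverse-Hessian approximation H
    is rescaled before each update.  `theta` blends the Oren-Luenberger
    scaling factor with 1; this is a generic illustration, not the
    double-parameter scheme of the paper above."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # quasi-Newton search direction
        # Backtracking Armijo line search (a common, simple choice).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                  # curvature condition holds
            # Self-scaling step: shrink/stretch H before the BFGS update.
            tau = theta * sy / (y @ H @ y) + (1.0 - theta)
            H = tau * H
            rho = 1.0 / sy
            I = np.eye(n)
            V = I - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update
        x, g = x_new, g_new
    return x
```

For example, on the strictly convex quadratic f(x) = (x1 - 1)^2 + 10(x2 + 2)^2 the iteration converges to the minimizer (1, -2) in a handful of steps.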
Recommendations
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- An adaptive scaled BFGS method for unconstrained optimization
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
Cites work
- scientific article; zbMATH DE number 88930
- scientific article; zbMATH DE number 3466802
- scientific article; zbMATH DE number 3529352
- scientific article; zbMATH DE number 778130
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A Modified BFGS Algorithm for Unconstrained Optimization
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- A new approach to variable metric algorithms
- An unconstrained optimization test functions collection
- Analysis of a self-scaling quasi-Newton method
- Automatic Column Scaling Strategies for Quasi-Newton Methods
- Benchmarking optimization software with performance profiles
- CUTE
- Conditioning of Quasi-Newton Methods for Function Minimization
- Conjugate Gradient Methods with Inexact Searches
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Convergence Properties of the BFGS Algorithm
- Convergence analysis of a modified BFGS method on convex minimizations
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Matrix conditioning and nonlinear optimization
- Methods of conjugate gradients for solving linear systems
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- Modifying the BFGS method
- Modifying the BFGS update by a new column scaling technique
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- On the Behavior of Broyden’s Class of Quasi-Newton Methods
- On the Convergence of the Variable Metric Algorithm
- Optimization theory and methods. Nonlinear programming
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Quasi-Newton Methods, Motivation and Theory
- Reduced-Hessian quasi-Newton methods for unconstrained optimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Spectral scaling BFGS method
- The BFGS method with exact line searches fails for non-convex objective functions
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
- The superlinear convergence of a modified BFGS-type method for unconstrained optimization
- Two-Point Step Size Gradient Methods
- Updating conjugate directions by the BFGS formula
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
Cited in (17)
- An Alternative Scaling Factor in Broyden's Class Methods for Unconstrained Optimization
- The global convergence of the BFGS method with a modified WWP line search for nonconvex functions
- Signal recovery with constrained monotone nonlinear equations through an effective three-term conjugate gradient method
- The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique
- A conjugate direction implementation of the BFGS algorithm with automatic scaling
- Another three-term conjugate gradient method close to the memoryless BFGS for large-scale unconstrained optimization problems
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the \(\ell_\infty\) matrix norm
- An adaptive scaled BFGS method for unconstrained optimization
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- Hypergraph-based convex semi-supervised unconstraint symmetric matrix factorization for image clustering
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A modified two-parameter scaled Broyden-type algorithm for unconstrained optimization problems
- Global convergence of a modified two-parameter scaled BFGS method with Yuan-Wei-Lu line search for unconstrained optimization
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A matrix form of spectral scaling in quasi-Newton algorithm
This page was built for publication: A double parameter scaled BFGS method for unconstrained optimization (MaRDI item Q1677470)