New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
From MaRDI portal
Publication: 2190791
DOI: 10.1007/s10092-020-00365-7
zbMATH Open: 1445.90100
OpenAlex: W3026423558
MaRDI QID: Q2190791
Authors: Neculai Andrei
Publication date: 22 June 2020
Published in: Calcolo
Full work available at URL: https://doi.org/10.1007/s10092-020-00365-7
Recommendations
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A modified conjugate gradient method based on the self-scaling memoryless BFGS update
- Scaled conjugate gradient algorithms for unconstrained optimization
Cites Work
- Algorithm 851
- CUTE
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Updating Quasi-Newton Matrices with Limited Storage
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- Acceleration of conjugate gradient algorithms for unconstrained optimization
- A new three-term conjugate gradient algorithm for unconstrained optimization
- Title not available
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Convergence Conditions for Ascent Methods. II: Some Corrections
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Sizing and Least-Change Secant Methods
- A spectral conjugate gradient method for unconstrained optimization
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A modified scaling parameter for the memoryless BFGS updating formula
- Title not available
- Self-Scaling Variable Metric (SSVM) Algorithms
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable metric algorithms
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- On the rate of convergence of the preconditioned conjugate gradient method
- A New Variational Result for Quasi-Newton Formulae
- The Limited Memory Conjugate Gradient Method
- Convergence analysis of nonlinear conjugate gradient methods
- Conjugate Gradient Methods with Inexact Searches
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Some Superlinear Convergence Results for the Conjugate Gradient Method
- New convergence results and preconditioning strategies for the conjugate gradient method
- An adaptive conjugate gradient algorithm for large-scale unconstrained optimization
- Block splittings for the conjugate gradient method
- Title not available
- Analysis of a self-scaling quasi-Newton method
- Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization
Cited In (6)
- A new self-scaling memoryless quasi-Newton update for unconstrained optimization
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A modified scaling parameter for the memoryless BFGS updating formula
- Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- A new version of augmented self-scaling BFGS method
Uses Software
- Algorithm 851
- CUTE