Stability of Huang's update for the conjugate gradient method
From MaRDI portal
Publication:2554542
Cites work
- A Rapidly Convergent Descent Method for Minimization
- A new approach to variable metric algorithms
- Computational experience with quadratically convergent minimisation methods
- Parameter Selection for Modified Newton Methods for Function Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Unified approach to quadratically convergent algorithms for function minimization
- Variable Metric Method for Minimization
- Variations on Variable-Metric Methods
Cited in (5)
- Global convergence property of scaled two-step BFGS method
- Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Extra updates for the BFGS method
- Variable metric methods for unconstrained optimization and nonlinear least squares
This page was built for publication: Stability of Huang's update for the conjugate gradient method