Stability of Huang's update for the conjugate gradient method
Publication:2554542
DOI: 10.1007/BF00935660 · zbMATH Open: 0243.49013 · OpenAlex: W2065199474 · MaRDI QID: Q2554542
Author: E. Spedicato
Publication date: 1973
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00935660
Cites Work
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Quasi-Newton Methods and their Application to Function Minimisation
- Variations on Variable-Metric Methods
- Unified approach to quadratically convergent algorithms for function minimization
- Computational experience with quadratically convergent minimisation methods
- Parameter Selection for Modified Newton Methods for Function Minimization
Cited In (5)
- Global convergence property of scaled two-step BFGS method
- Convergence analysis of the self-dual optimally conditioned SSVM method of Oren-Spedicato
- Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
- Extra updates for the BFGS method
- Variable metric methods for unconstrained optimization and nonlinear least squares