Optimal conditioning in the convex class of rank two updates
Publication: Q4187601
DOI: 10.1007/BF01609030
zbMATH Open: 0402.90086
MaRDI QID: Q4187601
Author: Robert B. Schnabel
Publication date: 1978
Published in: Mathematical Programming
Cites Work
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Optimal conditioning of self-scaling variable Metric algorithms
- Optimally conditioned optimization algorithms without line searches
- Rank-one and Rank-two Corrections to Positive Definite Matrices Expressed in Product Form
- A bound to the condition number of canonical rank-two corrections and applications to the variable metric method
- On the convergence rate of imperfect minimization algorithms in Broyden's β-class
Cited In (3)