A fast and robust unconstrained optimization method requiring minimum storage
From MaRDI portal
Publication: 3693274
DOI: 10.1007/BF01585658
zbMath: 0574.90073
OpenAlex: W1979191951
MaRDI QID: Q3693274
No author found.
Publication date: 1985
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01585658
Numerical mathematical programming methods (65K05)
Nonlinear programming (90C30)
Numerical methods based on nonlinear programming (49M37)
Related Items (3)
- Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
- An algorithm for solving sparse nonlinear least squares problems
- Vectorization of conjugate-gradient methods for large-scale minimization in meteorology
Uses Software
Cites Work
- Eight unnamed (unresolved) cited items
- A Note on Minimization Algorithms which make Use of Non-quadratic Properties of the Objective Function
- Technical Note—A Modified Conjugate Gradient Algorithm
- Optimally conditioned optimization algorithms without line searches
- Optimal conditioning of self-scaling variable metric algorithms
- Matrix conditioning and nonlinear optimization
- An assessment of two approaches to variable metric methods
- A combined conjugate-gradient quasi-Newton minimization algorithm
- Restart procedures for the conjugate gradient method
- Conjugate Gradient Methods with Inexact Searches
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- A new approach to variable metric algorithms
- A Modification of Davidon's Minimization Method to Accept Difference Approximations of Derivatives
- Linear Convergence of the Conjugate Gradient Method