Substitute derivatives in unconstrained optimization: A comparison of finite difference and response surface approximations
DOI: 10.1016/0305-0548(88)90018-4 · zbMATH: 0643.90078 · OpenAlex: W1980136879 · MaRDI QID: Q1102201
Publication date: 1988
Published in: Computers & Operations Research
Full work available at URL: https://doi.org/10.1016/0305-0548(88)90018-4
Keywords: conjugate gradient algorithm; steepest descent algorithm; finite difference approximation; experimental designs; quasi-Newton algorithm; accuracy of unconstrained gradient search; first-order response surfaces; substitute derivatives
MSC: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
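The two derivative substitutes compared in the paper can be illustrated briefly. The sketch below is not taken from the paper itself: the test function (Rosenbrock), the step size `h`, the sampling radius, and the sample count are all illustrative assumptions. It contrasts a forward finite-difference gradient with a first-order response-surface gradient, i.e. the slope of a linear model fitted by least squares to function values sampled near the current point.

```python
import numpy as np

def f(x):
    # Rosenbrock function: a standard unconstrained test problem (illustrative choice).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def fd_gradient(f, x, h=1e-6):
    """Forward finite-difference substitute for the analytic gradient."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def rs_gradient(f, x, radius=1e-3, n_samples=8, seed=0):
    """First-order response-surface substitute: fit y ~ b0 + b . (z - x)
    to sampled points around x; the slope vector b estimates the gradient."""
    rng = np.random.default_rng(seed)
    Z = x + radius * rng.standard_normal((n_samples, len(x)))
    y = np.array([f(z) for z in Z])
    A = np.hstack([np.ones((n_samples, 1)), Z - x])  # design matrix [1, dz]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]

x0 = np.array([0.5, 0.5])
print(fd_gradient(f, x0))  # ≈ [-51., 50.], the analytic gradient at x0
print(rs_gradient(f, x0))  # similar estimate, at the cost of more function evaluations
```

Both routines use only function values, which is the point of a "substitute derivative": the finite-difference version needs n extra evaluations per gradient, while the response-surface version spends more evaluations on a local experimental design and smooths the estimate through regression.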
Related Items (3)
Cites Work
- Mathematical programming and the optimization of computer simulations
- An Evaluation of Substitute Methods for Derivatives in Unconstrained Optimization
- Design and Testing of a Generalized Reduced Gradient Code for Nonlinear Programming
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization