A curvilinear search algorithm for unconstrained optimization by automatic differentiation
Publication:2770192
DOI: 10.1080/10556780108805822 · zbMath: 1103.90399 · OpenAlex: W2014149361 · MaRDI QID: Q2770192
No author found.
Publication date: 2001
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556780108805822
Keywords: Automatic differentiation; Unconstrained optimization; Curvilinear trajectory; Ill-conditioned and badly scaled problems
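The keywords point to the general technique: a curvilinear search that combines a Newton-type descent direction with a direction of negative curvature, with the required derivatives obtained by automatic differentiation. The sketch below illustrates that idea only; it is not the paper's algorithm. The curvilinear path x(α) = x + α²p + αd follows the negative-curvature line-search literature cited under "Cites Work", while the test function (Rosenbrock), the eigenvalue-based direction computation, the backtracking rule, and all tolerances are illustrative assumptions. JAX is used here purely as one possible automatic-differentiation tool.

```python
# Minimal sketch (not the paper's exact method) of a curvilinear search
# combining a descent direction with a negative-curvature direction,
# using automatic differentiation (JAX) for gradients and Hessians.
import jax
import jax.numpy as jnp

def rosenbrock(x):
    # Illustrative test problem; any smooth objective would do.
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

grad = jax.grad(rosenbrock)
hess = jax.hessian(rosenbrock)

def curvilinear_step(x, alpha=1.0, beta=0.5, sigma=1e-4, max_backtracks=50):
    g, H = grad(x), hess(x)
    eigvals, eigvecs = jnp.linalg.eigh(H)  # eigenvalues in ascending order

    # Descent direction: Newton-like step on a positive-definite model,
    # obtained by clipping the eigenvalues away from zero.
    safe = jnp.maximum(eigvals, 1e-8)
    p = -eigvecs @ ((eigvecs.T @ g) / safe)

    # Negative-curvature direction: eigenvector of the smallest eigenvalue,
    # oriented downhill, and set to zero if the Hessian has no negative curvature.
    d = eigvecs[:, 0]
    d = jnp.where(d @ g > 0.0, -d, d)
    d = jnp.where(eigvals[0] < 0.0, d, jnp.zeros_like(d))

    # Backtrack along the curve x(alpha) = x + alpha^2 * p + alpha * d
    # until an Armijo-type decrease condition using both directions holds.
    f0 = rosenbrock(x)
    for _ in range(max_backtracks):
        x_new = x + alpha ** 2 * p + alpha * d
        model = alpha ** 2 * (g @ p) + 0.5 * alpha ** 2 * jnp.minimum(eigvals[0], 0.0)
        if rosenbrock(x_new) <= f0 + sigma * model:
            return x_new
        alpha *= beta
    return x  # no acceptable step found

x = jnp.array([-1.2, 1.0])
for _ in range(100):
    x = curvilinear_step(x)
print(x)  # should approach the minimizer (1.0, 1.0)
```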
Related Items (2)
- Using negative curvature in solving nonlinear programs
- On the final steps of Newton and higher order methods
Cites Work
- Block truncated-Newton methods for parallel optimization
- A truncated Newton method with non-monotone line search for unconstrained optimization
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- ODE versus SQP methods for constrained optimization
- Nonmonotonic trust region algorithm
- Curvilinear path steplength algorithms for minimization which use directions of negative curvature
- Testing Unconstrained Optimization Software
- Tensor Methods for Unconstrained Optimization Using Second Derivatives
- A modification of Armijo's step-size rule for negative curvature
- On the use of directions of negative curvature in a modified newton method
- Curvilinear Stabilization Techniques for Truncated Newton Methods in Large Scale Unconstrained Optimization
- Numerical Optimization
- Tensor Methods for Large, Sparse Unconstrained Optimization
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization