A curvilinear optimization method based upon iterative estimation of the eigensystem of the Hessian matrix
Publication: Q1248807
DOI: 10.1016/0022-247X(78)90085-9
zbMATH: 0383.49024
MaRDI QID: Q1248807
Publication date: 1978
Published in: Journal of Mathematical Analysis and Applications
Mathematics Subject Classification:
Nonlinear programming (90C30)
Numerical optimization and variational techniques (65K10)
Newton-type methods (49M15)
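The record itself carries only bibliographic data, but the title points to the general family of curvilinear (arc) search methods that use the eigensystem of the Hessian. The sketch below is a minimal, generic illustration of that idea, not the paper's algorithm: the function name `curvilinear_step`, the modified-Newton/negative-curvature split, the `eps` floor, and the arc parameter `alpha` are all illustrative assumptions.

```python
import numpy as np

def curvilinear_step(grad, hess, alpha):
    """Displacement along a generic curvilinear (arc) search direction
    built from the eigendecomposition of the Hessian; illustrative only."""
    # Symmetric eigendecomposition H = V diag(lam) V^T.
    lam, V = np.linalg.eigh(hess)
    g = V.T @ grad                                   # gradient in the eigenbasis

    # Modified-Newton component: scale by 1/|lam| so that directions of
    # negative curvature are not followed uphill.
    eps = 1e-8                                       # assumed regularisation floor
    newton = V @ (g / np.maximum(np.abs(lam), eps))

    # Negative-curvature component: the part of the gradient lying in the
    # subspace spanned by eigenvectors with lam < 0.
    neg_curv = V @ np.where(lam < 0.0, g, 0.0)

    # Arc x(alpha) = x + alpha^2 * (Newton part) + alpha * (negative-curvature
    # part), a common curvilinear-search form; the signs give descent.
    return -(alpha ** 2) * newton - alpha * neg_curv

# Example: one trial point for f(x, y) = x^2 - y^2 at (1, 1), where the
# Hessian is indefinite and a pure Newton step would head for the saddle.
x = np.array([1.0, 1.0])
g = np.array([2.0, -2.0])                 # gradient of f at x
H = np.array([[2.0, 0.0], [0.0, -2.0]])   # Hessian of f
x_trial = x + curvilinear_step(g, H, alpha=0.5)
```

In practice the arc parameter would be chosen by a line (arc) search enforcing sufficient decrease; the floor on |lam| is an arbitrary regularisation choice for the sketch.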
Related Items (10)
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- OPTAC: A portable software package for analyzing and comparing optimization methods by visualization
- A comparison of methods for traversing regions of non-convexity in optimization problems
- A trajectory-based method for constrained nonlinear optimization problems
- A new arc algorithm for unconstrained optimization
- A Newton-type curvilinear search method for constrained optimization
- Experiments with new stochastic global optimization search techniques
- A new super-memory gradient method with curve search rule
- Note on global convergence of ODE methods for unconstrained optimization
- Comparison of partition evaluation measures in an adaptive partitioning algorithm for global optimization
Cites Work
- A class of differential descent methods for constrained optimization
- A Newton-type curvilinear search method for optimization
- Differential gradient methods
- A Rapidly Convergent Descent Method for Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- A new approach to variable metric algorithms