Iterative computation of negative curvature directions in large scale optimization
Publication: 2457949
DOI: 10.1007/s10589-007-9034-z
zbMath: 1171.90549
OpenAlex: W1990689599
MaRDI QID: Q2457949
Publication date: 24 October 2007
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-007-9034-z
Keywords: conjugate gradient method; large scale optimization; negative curvature directions; convergence to second order critical points
MSC classification: Applications of renewal theory (reliability, demand theory, etc.) (60K10); Methods of reduced gradient type (90C52)
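The keywords above refer to detecting negative curvature while solving the Newton system with a conjugate gradient (Krylov) iteration. As a generic, textbook-style illustration of that idea only (not the specific iterative scheme proposed in this paper), the sketch below runs CG on H d = -g and stops as soon as a search direction p with pᵀHp ≤ 0 appears; the function name and tolerance are illustrative assumptions.

    import numpy as np

    def cg_detect_negative_curvature(H, g, tol=1e-8, max_iter=200):
        # Generic CG on H d = -g (illustrative sketch; not the paper's algorithm).
        # Returns (d, p): d is the current CG iterate, p is a direction with
        # p^T H p <= 0 if one is encountered, otherwise None.
        d = np.zeros_like(g)
        r = -g.copy()            # residual of H d = -g at d = 0
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Hp = H @ p
            curv = p @ Hp
            if curv <= 0:
                return d, p      # non-positive curvature direction detected
            alpha = rs / curv
            d = d + alpha * p
            r = r - alpha * Hp
            rs_new = r @ r
            if np.sqrt(rs_new) <= tol:
                return d, None   # system solved; no negative curvature met on the Krylov subspace
            p = r + (rs_new / rs) * p
            rs = rs_new
        return d, None

    # Example on a small indefinite Hessian: the returned p satisfies p^T H p <= 0.
    H = np.array([[2.0, 0.0], [0.0, -1.0]])
    g = np.array([1.0, 1.0])
    d, p = cg_detect_negative_curvature(H, g)
    assert p is None or p @ (H @ p) <= 0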
Related Items
- A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization
- Preconditioning Newton-Krylov methods in nonconvex large scale optimization
- An adaptive truncation criterion, for linesearch-based truncated Newton methods in large scale nonconvex optimization
- Iterative grossone-based computation of negative curvature directions in large-scale optimization
- Conjugate direction methods and polarity for quadratic hypersurfaces
- An Improvement of the Pivoting Strategy in the Bunch and Kaufman Decomposition, Within Truncated Newton Methods
- Polarity and conjugacy for quadratic hypersurfaces: a unified framework with recent advances
- Iterative computation of negative curvature directions in large scale optimization
- Novel preconditioners based on quasi-Newton updates for nonlinear conjugate gradient methods
- A curvilinear method based on minimal-memory BFGS updates
- Issues on the use of a modified Bunch and Kaufman decomposition for large scale Newton's equation
- A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
- A dwindling filter line search method for unconstrained optimization
- A framework of conjugate direction methods for symmetric linear systems in optimization
Uses Software
Cites Work
- A new three-term conjugate gradient method
- A composite step bi-conjugate gradient algorithm for nonsymmetric linear systems
- Numerical experiences with new truncated Newton methods in large scale unconstrained optimization
- A survey of truncated-Newton methods
- Nonmonotone curvilinear line search methods for unconstrained optimization
- Iterative computation of negative curvature directions in large scale optimization
- Memory gradient method for the minimization of functions
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. I: Theory
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application
- Convergence to Second Order Stationary Points in Inequality Constrained Optimization
- Computing a Trust Region Step
- A Family of Trust-Region-Based Algorithms for Unconstrained Minimization with Strong Global Convergence Properties
- Solution of Sparse Indefinite Systems of Linear Equations
- A modification of Armijo's step-size rule for negative curvature
- On the use of directions of negative curvature in a modified Newton method
- Curvilinear Stabilization Techniques for Truncated Newton Methods in Large Scale Unconstrained Optimization
- Trust Region Methods
- Exploiting negative curvature directions in linesearch methods for unconstrained optimization
- Solving the Trust-Region Subproblem using the Lanczos Method
- CUTEr and SifDec