Exploiting negative curvature directions in linesearch methods for unconstrained optimization
Publication:4521380
DOI: 10.1080/10556780008805794
zbMath: 0988.90039
OpenAlex: W2081397705
Wikidata: Q58185866 (Scholia: Q58185866)
MaRDI QID: Q4521380
No author found.
Publication date: 16 July 2002
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556780008805794
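The record itself contains no exposition of the method, but the idea named in the title, combining a descent direction with a negative curvature direction inside a linesearch, can be illustrated with a short sketch. The curvilinear step x(alpha) = x + alpha^2 d + alpha s and the Armijo-type acceptance test below follow the general Moré-Sorensen-style curvilinear search pattern; the step rule, fallback choices, and all parameter values are illustrative assumptions, not the specific algorithm of the cited paper.

```python
# Illustrative sketch only: a curvilinear linesearch that exploits a negative
# curvature direction s together with a descent direction d, stepping along
# x(alpha) = x + alpha**2 * d + alpha * s.  Not the algorithm of the paper.
import numpy as np

def negative_curvature_direction(H, tol=1e-10):
    """Eigenvector of H for its most negative eigenvalue, or None if H is
    (numerically) positive semidefinite."""
    eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues in ascending order
    if eigvals[0] >= -tol:
        return None
    return eigvecs[:, 0]

def curvilinear_step(f, grad, hess, x, alpha0=1.0, beta=0.5, c=1e-4,
                     max_backtracks=50):
    g = grad(x)
    H = hess(x)
    # Descent direction: regularized Newton step, falling back to steepest descent.
    try:
        d = np.linalg.solve(H + 1e-8 * np.eye(len(x)), -g)
        if g @ d >= 0:
            d = -g
    except np.linalg.LinAlgError:
        d = -g
    # Negative curvature direction, oriented so it is a non-ascent direction.
    s = negative_curvature_direction(H)
    if s is None:
        s = np.zeros_like(x)
    elif g @ s > 0:
        s = -s
    fx = f(x)
    alpha = alpha0
    for _ in range(max_backtracks):
        trial = x + alpha**2 * d + alpha * s
        # Armijo-type sufficient decrease along the curvilinear path.
        decrease = c * alpha**2 * (g @ d + 0.5 * (s @ H @ s))
        if f(trial) <= fx + decrease:
            return trial
        alpha *= beta
    return x
```

A typical use would call curvilinear_step repeatedly on a twice-differentiable objective (f, grad, hess supplied by the user) until the gradient is small and no negative curvature direction remains, at which point the iterate satisfies second-order necessary conditions approximately.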
Related Items
Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
Second-order negative-curvature methods for box-constrained and general constrained optimization
Using improved directions of negative curvature for the solution of bound-constrained nonconvex problems
Preconditioning Newton-Krylov methods in nonconvex large scale optimization
Adaptive nonmonotone line search method for unconstrained optimization
Finding second-order stationary points in constrained minimization: a feasible direction approach
Iterative grossone-based computation of negative curvature directions in large-scale optimization
Exploiting negative curvature in deterministic and stochastic optimization
Using negative curvature in solving nonlinear programs
Conjugate direction methods and polarity for quadratic hypersurfaces
Optimal quotients for solving large eigenvalue problems
Iterative computation of negative curvature directions in large scale optimization
Conjugate gradient (CG)-type method for the solution of Newton's equation within optimization frameworks
Nonconvex optimization using negative curvature within a modified linesearch
Combining and scaling descent and negative curvature directions
A curvilinear search algorithm for unconstrained optimization by automatic differentiation
A curvilinear method based on minimal-memory BFGS updates
A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
A second-order globally convergent direct-search method and its worst-case complexity
A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
A symmetric rank-one quasi-Newton line-search method using negative curvature directions
Improving directions of negative curvature in an efficient manner
A likelihood-based boosting algorithm for factor analysis models with binary data
A dwindling filter line search method for unconstrained optimization
Planar conjugate gradient algorithm for large-scale unconstrained optimization. I: Theory
Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application
A framework of conjugate direction methods for symmetric linear systems in optimization
The higher-order Levenberg–Marquardt method with Armijo type line search for nonlinear equations