Local and Superlinear Convergence for Partially Known Quasi-Newton Methods
DOI: 10.1137/0801005 · zbMATH Open: 0752.90063 · OpenAlex: W2059338280 · MaRDI QID: Q4017638 · FDO: Q4017638
Authors: Héctor Martínez, J. Engels
Publication date: 16 January 1993
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/0801005
Recommendations
- Local and superlinear convergence of structured quasi-Newton methods for nonlinear optimization
- New results on superlinear convergence of classical quasi-Newton methods
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
Keywords: secant method; constrained optimization; unconstrained optimization; nonlinear least-squares; bounded deterioration; local and \(q\)-superlinear convergence of quasi-Newton methods
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
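The keywords point to the structured secant framework behind partially known quasi-Newton methods. As a minimal sketch of the standard formulation (illustrative notation, not necessarily the exact setup of this paper): the Hessian is split as \(\nabla^2 f(x) = C(x) + A(x)\), where \(C\) can be computed exactly and only \(A\) is approximated. With \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\), the update keeps the known part and imposes the secant condition only on the unknown part,
\[ B_{k+1} = C(x_{k+1}) + A_{k+1}, \qquad A_{k+1} s_k = y_k - C(x_{k+1})\, s_k, \]
so that the full matrix satisfies \(B_{k+1} s_k = y_k\). Bounded deterioration of the approximations \(A_k\) is the standard route to local and \(q\)-superlinear convergence in this framework.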
Cited In (22)
- On simulation and optimization of macroeconometric models
- Operations research and optimization (ORO)
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- Title not available
- Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems
- New results on superlinear convergence of classical quasi-Newton methods
- Greedy PSB methods with explicit superlinear convergence
- Rates of superlinear convergence for classical quasi-Newton methods
- A new structured spectral conjugate gradient method for nonlinear least squares problems
- Exploiting additional structure in equality constrained optimization by structured SQP secant algorithms
- Numerical algorithms for constrained maximum likelihood estimation
- Scaled nonlinear conjugate gradient methods for nonlinear least squares problems
- A robust combined trust region–line search exact penalty projected structured scheme for constrained nonlinear least squares
- Structured symmetric rank-one method for unconstrained optimization
- Adaptive algorithm for constrained least-squares problems
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- A bounded deterioration property of a symmetric positive definite class of Newton-like methods and its application
- Local and superlinear convergence for truncated iterated projections methods
- A Kantorovich theorem for the structured PSB update in Hilbert space.
- A structured L-BFGS method and its application to inverse problems
- A Convergence Theory for a Class of Quasi-Newton Methods for Constrained Optimization