On the local convergence of a derivative-free algorithm for least-squares minimization
Publication: 429479
DOI: 10.1007/s10589-010-9367-x · zbMath: 1268.90043 · MaRDI QID: Q429479
Andrew R. Conn, Hongchao Zhang
Publication date: 19 June 2012
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-010-9367-x
Keywords: system of nonlinear equations; least-squares; local convergence; derivative-free optimization; Levenberg–Marquardt method; trust region; asymptotic convergence
90C56: Derivative-free methods and methods using generalized derivatives
90C20: Quadratic programming
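The keywords and MSC codes above point to a derivative-free Levenberg–Marquardt method for least-squares problems. As a purely illustrative sketch (not the algorithm analyzed in the paper, which builds interpolation models rather than finite-difference Jacobians), the basic damped Gauss–Newton iteration with a derivative-free Jacobian approximation can be written as:

```python
import numpy as np

def fd_jacobian(r, x, h=1e-7):
    """Forward-difference Jacobian of the residual r at x (no analytic derivatives)."""
    r0 = r(x)
    J = np.empty((r0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (r(xp) - r0) / h
    return J

def levenberg_marquardt(r, x0, lam=1e-3, tol=1e-10, max_iter=100):
    """Minimal Levenberg-Marquardt loop for min_x 0.5 * ||r(x)||^2.

    The damping parameter lam is increased when a trial step fails to
    reduce the residual norm and decreased when it succeeds, mimicking
    a trust-region mechanism.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx = r(x)
        J = fd_jacobian(r, x)
        g = J.T @ rx                      # gradient of 0.5 * ||r||^2
        if np.linalg.norm(g) < tol:
            break
        # Damped normal equations: (J^T J + lam * I) delta = -J^T r
        delta = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
        if np.linalg.norm(r(x + delta)) < np.linalg.norm(rx):
            x = x + delta
            lam = max(lam * 0.1, 1e-12)   # successful step: relax damping
        else:
            lam *= 10.0                   # failed step: increase damping
    return x
```

For a zero-residual system (as in the local convergence theory referenced here), e.g. `r(x) = [x**2 - 2]`, the iteration converges rapidly to `sqrt(2)` from a nearby starting point.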
Related Items
- On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- A Stochastic Levenberg–Marquardt Method Using Random Models with Complexity Results
- Derivative-free optimization methods
- An affine-scaling derivative-free trust-region method for solving nonlinear systems subject to linear inequality constraints
- Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems
- An affine scaling derivative-free trust region method with interior backtracking technique for bounded-constrained nonlinear programming
- Conjugate gradient path method without line search technique for derivative-free unconstrained optimization
- An inexact derivative-free Levenberg-Marquardt method for linear inequality constrained nonlinear systems under local error bound conditions
- A class of derivative-free trust-region methods with interior backtracking technique for nonlinear optimization problems subject to linear inequality constraints
- A derivative-free trust region algorithm with nonmonotone filter technique for bound constrained optimization
- Combined gradient methods for multiobjective optimization
- Accelerated derivative-free nonlinear least-squares applied to the estimation of Manning coefficients
- A brief survey of methods for solving nonlinear least-squares problems
- A derivative-free Gauss-Newton method
- Recent advances in trust region algorithms
- An interior affine scaling cubic regularization algorithm for derivative-free optimization subject to bound constraints
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
Uses Software
Cites Work
- Test examples for nonlinear programming codes
- On trust region methods for unconstrained minimization without derivatives
- On the quadratic convergence of the Levenberg-Marquardt method without nonsingularity assumption
- Least Frobenius norm updating of quadratic models that satisfy interpolation conditions
- Geometry of interpolation sets in derivative free optimization
- Self-adaptive inexact proximal point methods
- Convergence properties of a self-adaptive Levenberg-Marquardt algorithm under local error bound condition
- The Proximal Point Algorithm with Genuine Superlinear Convergence for the Monotone Complementarity Problem
- A Derivative-Free Algorithm for Least-Squares Minimization
- Asymptotic Convergence Analysis of a New Class of Proximal Point Methods
- Developments of NEWUOA for minimization without derivatives
- Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation
- Introduction to Derivative-Free Optimization
- Trust Region Methods
- Convergence Properties of the Inexact Levenberg-Marquardt Method under Local Error Bound Conditions
- Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points